Web scraping with Python – A beginner’s guide

Have you ever heard of web scraping? It is an automated technique used to extract large amounts of data from websites. If you are interested in getting started with web scraping, then this tutorial is for you!

Imagine you have to pull out a huge amount of data from a particular website. Is it possible to do so, without manually going to each webpage and getting the data? Well yes, it is definitely possible using a technique called “Web Scraping”.

Web Scraping is an automated technique used to extract large amounts of data from websites and save it to a local file on your computer. It is becoming increasingly popular since the extracted data can serve many different purposes, such as:

  • Price Comparison: Web Scraping can be used to track product and service prices in different markets over time.
  • Social Media Scraping: Data from Social Media websites like Twitter can be collected by Web Scraping and used to find out what’s trending.
  • Research and Development: Web Scraping is used to collect large data sets like statistics, temperature, etc. from different websites, which are then used to carry out surveys.
  • Recruitment: Data from career-focused websites can be extracted by Web Scraping in order to find the right people for certain job vacancies.
  • Extraction of Contact Information: Web Scraping can be used to scrape contact information such as emails, URLs, and phone numbers from websites.

Web Scraping has a lot of applications, but implementing it can be slightly intimidating, so in this article I will break the process down into clear steps to help you understand it better.

But before we get into that, here are some important points to remember about Web Scraping:

  1. Always read through the website’s Terms and Conditions to understand how you can legally use its data, since many websites prohibit using the data for commercial purposes.
  2. Make sure you are not downloading the data at a rapid rate, because this might overload the website’s servers and get your IP address blocked.
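The rate-limiting point above can be sketched with a small throttling helper. This is a minimal illustration, assuming pages are fetched inside a loop; the `throttled` generator and the example URLs are hypothetical, not part of the tutorial’s later code:

```python
import time

def throttled(items, delay=2.0):
    """Yield items one at a time, pausing between them so requests stay polite."""
    for i, item in enumerate(items):
        if i:  # no pause needed before the very first request
            time.sleep(delay)
        yield item

# Hypothetical page URLs; in real use you would fetch and parse each one
urls = ["https://example.com/page1", "https://example.com/page2"]
for url in throttled(urls, delay=2.0):
    pass  # fetch and parse the page here
```

A fixed pause of a second or two between requests is a common conservative default; check the site’s robots.txt and Terms for its actual expectations.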

So, how does Web Scraping work?

To extract data using Web Scraping with Python, you need to follow the below steps:

  • Find the URL you want to scrape
  • Inspect the Page
  • Find the data you need to extract
  • Write the code
  • Run the code and extract the required data
  • Store the data in a specific format

Now, let us implement these steps in an example and see how to extract data from the Flipkart website using Python.

Here are some libraries used for Web Scraping:

  • Selenium: A web testing library used to automate browser activities.
  • BeautifulSoup: A Python package for parsing HTML and XML documents. It creates parse trees that make it easy to extract data.
  • Pandas: A library for data analysis and data manipulation. Here, it is used to store the extracted data in the desired format.

Now, let’s get started with the demonstration.

Scraping the Flipkart Website

Pre-requisites: Python 3.x with the Selenium, BeautifulSoup (bs4), and pandas libraries installed; Google Chrome browser; Ubuntu operating system

Step 1: Find the URL you want to scrape

We are going to scrape the Flipkart website to extract the Price, Name, and Rating of laptops. The URL we will use appears in the code in Step 4.

Step 2: Inspect the Page

The data on the website is nested in tags. So, we need to inspect the page to see under which tag the data we want to scrape is nested. To inspect an element, just right-click on it and select “Inspect”.

When you click on “Inspect”, a “Browser Inspector Box” will open on your screen.

Step 3: Find the data you need to extract

For this example, let us extract the Price, Name, and Rating, which are nested in “div” tags.

Step 4: Write the code

First, create a Python file. For this, open a terminal in Ubuntu and type gedit <your file name> with a .py extension.

Let the file name be “web-s”. Here is the command:

gedit web-s.py

Now, let’s write our code in this file.

Before that, you need to import all the necessary libraries:

from selenium import webdriver
from bs4 import BeautifulSoup
import pandas as pd

We have to set the path to chromedriver in order to configure webdriver to use the Chrome browser:

driver = webdriver.Chrome("/usr/lib/chromium-browser/chromedriver")

Note: passing the driver path directly works in Selenium 3. If you are on Selenium 4 or later, the path goes through a Service object instead:

from selenium.webdriver.chrome.service import Service
driver = webdriver.Chrome(service=Service("/usr/lib/chromium-browser/chromedriver"))

Refer to the below code to open the URL:

products = []  # List to store name of the product
prices = []    # List to store price of the product
ratings = []   # List to store rating of the product
driver.get("https://www.flipkart.com/laptops/~buyback-guarantee-on-laptops-/pr?sid=6bo%2Cb5g&uniq")

Now that we have written the code to open the URL, let’s extract the data from the website. As mentioned earlier, the data we want to extract is nested in tags. So, we have to find the <div> tags with the respective class names, extract the data, and store it in a variable. Refer to the code below:

content = driver.page_source
soup = BeautifulSoup(content, "html.parser")
# The class names below are specific to Flipkart's markup at the time of
# writing and may change when the site is redesigned
for a in soup.find_all('a', href=True, attrs={'class': '_31qSD5'}):
    name = a.find('div', attrs={'class': '_3wU53n'})
    price = a.find('div', attrs={'class': '_1vC4OE _2rQ-NK'})
    rating = a.find('div', attrs={'class': 'hGSR34 _2beYZw'})
    products.append(name.text)
    prices.append(price.text)
    ratings.append(rating.text)
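Not every product card on a page necessarily carries a rating, so `rating` can come back as None, and `rating.text` would then raise an AttributeError. Below is a minimal defensive sketch of the same find/extract pattern, run against a tiny inline HTML snippet with made-up class names (`product`, `name`, `price`, `rating`) standing in for Flipkart’s real ones:

```python
from bs4 import BeautifulSoup

# Inline stand-in for driver.page_source; the class names are hypothetical
html = """
<a class="product" href="/p/1">
  <div class="name">Laptop A</div>
  <div class="price">Rs. 50,000</div>
  <div class="rating">4.5</div>
</a>
<a class="product" href="/p/2">
  <div class="name">Laptop B</div>
  <div class="price">Rs. 60,000</div>
</a>
"""

soup = BeautifulSoup(html, "html.parser")
products, prices, ratings = [], [], []
for a in soup.find_all("a", href=True, attrs={"class": "product"}):
    name = a.find("div", attrs={"class": "name"})
    price = a.find("div", attrs={"class": "price"})
    rating = a.find("div", attrs={"class": "rating"})
    # Guard against missing tags instead of calling .text on None
    products.append(name.text if name else "")
    prices.append(price.text if price else "")
    ratings.append(rating.text if rating else "")
```

The second product has no rating div, so its slot is filled with an empty string instead of crashing the loop.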

Step 5: Run the code to extract the data

Use the below command to run the code:

python3 web-s.py

Step 6: Store the data in the desired format

After extracting the data, you might want to store it in the desired format. For this example, we will store it in a CSV (Comma Separated Value) format. To do this, add the following lines to your code:

df = pd.DataFrame({'Product Name': products, 'Price': prices, 'Rating': ratings})
df.to_csv('products.csv', index=False, encoding='utf-8')
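As a quick sanity check that the CSV came out right, you can write the file and read it back with pandas. The product values below are made up purely for illustration:

```python
import pandas as pd

# Made-up stand-ins for the lists populated by the scraping loop
products = ["Laptop A", "Laptop B"]
prices = ["Rs. 50,000", "Rs. 60,000"]
ratings = ["4.5", "4.2"]

df = pd.DataFrame({"Product Name": products, "Price": prices, "Rating": ratings})
df.to_csv("products.csv", index=False, encoding="utf-8")

# Reading the file back confirms the columns and rows survived the round trip
df_back = pd.read_csv("products.csv")
```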

Now, run the whole code again and you will get a file named “products.csv” which will contain your extracted data.

Python makes Web Scraping easy thanks to its readable syntax and large collection of libraries.

I hope this article was informative and helped you get familiar with the concept of Web Scraping using Python. Now you can go ahead and try Web Scraping by experimenting with different modules and applications of Python. If you don’t already know this language, why not learn Python this year?

