Michael Bryan

WebScraping With Python, Beautiful Soup, and Urllib3

Originally published by Leaundrae Mckinney  at dzone.com

In this day and age, information is key. Through the internet, we have an almost unlimited amount of information and data at our disposal. The problem, however, is that this abundance of information can overwhelm us as users. Fortunately, programmers can develop scripts that do the sorting, organizing, and extracting of this data for us. Work that would take hours to complete by hand can be accomplished with just over 50 lines of code and run in under a minute. Today, using Python, Beautiful Soup, and Urllib3, we will do a little web scraping and even scratch the surface of exporting the extracted data to a spreadsheet.


The website that we will be working with is called books.toscrape.com. It's one of those websites that is literally made for practicing WebScraping. Before we begin, please understand that we won't be rotating our IP Addresses or User Agents. However, on other websites, this may be a good idea, since they will most likely block you if you're not "polite." (I'll talk more on the concept of being polite in later posts. For now, just know that it means to space out the amount of time between your individual scrapes.) 

Okay, let's take a look at our target.

Basically, we want a list of every book title and price from this website. We notice that the prices are in British Pounds, so we'll want to convert them into US Dollars. If we scroll to the bottom of the page, we notice that there are 50 pages worth of books. Therefore, our script will have to iterate 50 times, while altering the base URL each time. The URL for this page changes one number each time, so a simple for loop should do the trick.
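Before writing any scraping code, we can sanity-check that URL pattern on its own. The sketch below just builds the 50 page URLs with an f-string; no requests are made:

```python
# Build all 50 page URLs; only the number after 'page-' changes
urls = [
    f'http://books.toscrape.com/catalogue/category/books_1/page-{i}.html'
    for i in range(1, 51)
]

print(urls[0])   # page 1
print(urls[-1])  # page 50
```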

Setup, Urllib3, and Beautiful Soup

Here's a breakdown of our tasks:

  • Import the required modules and create two master lists (titles and prices).
  • Using Urllib3 and Beautiful Soup, set up the environment to parse the first page.
  • Collect every book title from the page, and append it to one of the master lists.
  • Collect every book price from the page, convert to USD, and append to the prices master list.
  • Convert both master lists into a single dictionary.
  • Export to a CSV.

Now that we have our outline, we can get to work. Since we'll be putting everything into a function, be mindful of your indentations. Let's begin!

First, let's import our modules and define our function.

import urllib3, re
from bs4 import BeautifulSoup
from csv import DictWriter

#The file name will be whatever you decide when running the function
def get_book_data(filename):
    #These will be our Master Lists and must remain outside of any loops
    titles = []
    prices = []

Urllib3 is an HTTP client for Python. It's pretty versatile and perfect for what we need; for more information, check out the docs. Throughout most of your web scraping, there will be times when regex comes in handy. A prime example here is that all of the prices on the page have a Pound symbol in front of the numbers, and one of the easiest ways to remove and replace that symbol is with regular expressions. Finally, since we want to write our information to a CSV via a dictionary, it only makes sense to use the csv module.
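To see the regex idea in isolation before we wire it into the scraper, here is a minimal sketch; the sample price string is made up, but it follows the site's format:

```python
import re

raw_price = '£51.77'  # made-up sample in the same format as the site's prices

# Keep only the digits and the decimal point, dropping the Pound symbol
pieces = re.findall(r'[0-9.]+', raw_price)
amount = float(''.join(pieces))
print(amount)  # 51.77
```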

Next, we define our function as get_book_data and pass in the argument filename, which we will use to name our CSV.

#Convert British Pounds to USD (as of 20190801)
def gbp_to_usd(amount):
    return f'$ {round(amount * 1.21255, 2)}'

As of August 1st, 2019, the conversion rate from British Pounds to US Dollars is 1.21255. By defining this function, we are able to call it later when the time comes. By wrapping our calculation in the built-in round function, we round the number to the hundredths place.
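A quick sanity check of the conversion (the function is restated here so the snippet runs on its own):

```python
#Convert British Pounds to USD (as of 20190801)
def gbp_to_usd(amount):
    return f'$ {round(amount * 1.21255, 2)}'

print(gbp_to_usd(1))  # $ 1.21
print(gbp_to_usd(2))  # $ 2.43
```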

So far, so good. Now let's get into it.

#Prepare to scrape all 50 pages
    for i in range(1,51):
        #All of the page URLs follow the same format, with the exception of one number after 'page-'
        url = f'http://books.toscrape.com/catalogue/category/books_1/page-{i}.html'
        req = urllib3.PoolManager()
        res = req.request('GET', url)
        soup = BeautifulSoup(res.data, 'html.parser')
        contents = soup.find_all(class_='product_pod')

Because there are 50 pages, our range will need to be from 1 to 51 in order to capture all of them. Our URL takes us to the first page. Throughout each iteration, one will be added to i, giving us a new URL each time.

The PoolManager method allows for arbitrary requests while transparently keeping track of connection pools for us. The type of request that we are initiating to our URL is a GET request, which means that all we want is data.

Using Beautiful Soup, we parse the data from our request, specifically the HTML. If we take a look at the source code of the webpage, we'll notice that all of the products fall under the class product_pod. By calling the find_all method, we request all of the HTML elements with the class product_pod.
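To make the find_all call concrete without hitting the network, here is a self-contained sketch run against a tiny hand-written fragment that mimics the site's markup (the fragment itself is made up):

```python
from bs4 import BeautifulSoup

# Made-up HTML shaped like the product listings on books.toscrape.com
html = '''
<article class="product_pod">
  <h3><a title="A Light in the Attic" href="#">A Light in the ...</a></h3>
  <p class="price_color">£51.77</p>
</article>
<article class="product_pod">
  <h3><a title="Tipping the Velvet" href="#">Tipping the Velvet</a></h3>
  <p class="price_color">£53.74</p>
</article>
'''

soup = BeautifulSoup(html, 'html.parser')
contents = soup.find_all(class_='product_pod')
print(len(contents))  # 2
```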

#Based off of the title parameter within the site html
        for item in contents:
            titles.append(item.h3.a['title'])

If we look once again at the source data, we notice that the title appears twice — once as the inner text and the other as the value for the parameter title. Normally, we would extract from the inner HTML, but since the inner text cuts off most of the title we have to extract from within a tag.
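Here is that difference in miniature, using a made-up tag that truncates the visible text the same way the site does:

```python
from bs4 import BeautifulSoup

# Made-up snippet: the inner text is cut off, the title attribute is not
html = '<h3><a title="A Light in the Attic" href="#">A Light in the ...</a></h3>'
tag = BeautifulSoup(html, 'html.parser').a

print(tag.get_text())   # the truncated inner text
print(tag['title'])     # the full title from the attribute
```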

#Temporary lists for British currency conversions
        pounds = []
        c = []
        for item in contents:
            c.append(item.find(class_='price_color').get_text())
        for number in c:
            #Strip the British Pound symbol and join the numbers and decimal point back together
            amount = re.compile('[0-9.]+')
            num = amount.findall(number)
            pounds.append(float(''.join(num)))

Just like we did for the title, we are going to extract all of the HTML that falls within the price_color class. Now, remember that all of the prices are in British Pounds; so we need to remove the Pound symbol and replace it with the dollar sign. Before that, we extract all of the numbers and decimals from the gathered prices using regex. Once we have a list of numbers and decimals, we join and then append the floats to our temporary pounds list.

#Create a temporary list for the current loop and append to the master list after we run the conversion function
        temp = list(map(gbp_to_usd, pounds))
        for t in temp:
            prices.append(t)
    #Combine both lists into a dictionary once the page loop is finished
    res = dict(zip(titles, prices))

Again, we create another temporary list and call our conversion function on the Pounds list we just created. Finally, we run a quick for loop and append those items to our master prices list.

Now that we have both master lists, we're able to create a dictionary using the zip method. Make sure that this is outside of any loops; otherwise, it won't work as intended.
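The zip-to-dict step on its own, with made-up sample values standing in for the scraped master lists:

```python
# Made-up sample data standing in for the scraped master lists
titles = ['A Light in the Attic', 'Tipping the Velvet']
prices = ['$ 62.77', '$ 65.16']

res = dict(zip(titles, prices))
print(res['A Light in the Attic'])  # $ 62.77
```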


We open the specified filename with the intent to write. With Python 3, we have to include the newline='' argument; otherwise, the spreadsheet will have blank lines between each entry. We run a simple iteration through the dictionary, ensuring that our keys match the headers that we specified. Finally, we execute get_book_data with our file title and extension, wait about 20-30 seconds, and boom: we have a spreadsheet with all of the titles and prices from our target website!

#Create a CSV document (openable in Excel) with the dictionary
    with open(filename, 'w', newline='') as file:
        headers = ('Book Title', 'Price (in usd)')
        csv_writer = DictWriter(file, fieldnames=headers)
        csv_writer.writeheader()
        for k, v in res.items():
            csv_writer.writerow({
                'Book Title': k,
                'Price (in usd)': v
            })

If you found value in this article, please share and leave a comment below. Let me know what projects you are working or would like to see in the future. You can find all of the source code on my GitHub. Until next time!





The GitHub link did not follow the copy/paste, so here it is : https://github.com/PyTechDrae/PyTechCoding

Thanks to you, guys. 
