Python BeautifulSoup Examples

Tutorial: Web Scraping with Python Using Beautiful Soup

Published: March 30, 2021

Learn how to scrape the web with Python! The internet is an absolutely massive source of data — data that we can access using web scraping and Python! In fact, web scraping is often the only way we can access data. There is a lot of information out there that isn't available in convenient CSV exports or easy-to-connect APIs. And websites themselves are often valuable sources of data — consider, for example, the kinds of analysis you could do if you could download every post on a web forum. To access those sorts of on-page datasets, we'll have to use web scraping.

Don't worry if you're still a total beginner! In this tutorial we're going to cover how to do web scraping with Python from scratch, starting with some answers to frequently asked questions. Then we'll work through an actual web scraping project, focusing on weather data: we'll scrape weather data from the web to support a weather app. But before we start writing any Python, we've got to cover the basics! If you're already familiar with the concept of web scraping, feel free to scroll past these questions and jump right into the tutorial!

The Fundamentals of Web Scraping:

What is Web Scraping in Python?

Some websites offer data sets that are downloadable in CSV format, or accessible via an Application Programming Interface (API). But many websites with useful data don't offer these convenient options. Consider, for example, the National Weather Service's website. It contains up-to-date weather forecasts for every location in the US, but that weather data isn't accessible as a CSV or via API. It has to be viewed on the NWS site.

If we wanted to analyze this data, or download it for use in some other app, we wouldn't want to painstakingly copy-paste everything. Web scraping is a technique that lets us use programming to do the heavy lifting. We'll write some code that looks at the NWS site, grabs just the data we want to work with, and outputs it in the format we need.

In this tutorial, we'll show you how to perform web scraping using Python 3 and the Beautiful Soup library. We'll be scraping weather forecasts from the National Weather Service, and then analyzing them using the Pandas library. Just to be clear, lots of programming languages can be used to scrape the web! We also teach web scraping in R, for example. For this tutorial, though, we'll be sticking with Python.

How Does Web Scraping Work?

When we scrape the web, we write code that sends a request to the server that's hosting the page we specified. The server will return the source code — HTML, mostly — for the page (or pages) we requested.

So far, we're essentially doing the same thing a web browser does — sending a server request with a specific URL and asking the server to return the code for that page. But unlike a web browser, our web scraping code won't interpret the page's source code and display the page visually. Instead, we'll write some custom code that filters through the page's source code looking for specific elements we've specified, and extracting whatever content we've instructed it to extract.

For example, if we wanted to get all of the data from inside a table that was displayed on a web page, our code would be written to go through these steps in sequence:

1. Request the content (source code) of a specific URL from the server
2. Download the content that is returned
3. Identify the elements of the page that are part of the table we want
4. Extract and (if necessary) reformat those elements into a dataset we can analyze or use in whatever way we need

If that all sounds very complicated, don't worry!
Python and Beautiful Soup have built-in features designed to make this relatively straightforward. One thing that's important to note: from a server's perspective, requesting a page via web scraping is the same as loading it in a web browser. When we use code to submit these requests, we might be "loading" pages much faster than a regular user, and thus quickly eating up the website owner's server resources.

Why Use Python for Web Scraping?

As previously mentioned, it's possible to do web scraping with many programming languages. However, one of the most popular approaches is to use Python and the Beautiful Soup library, as we'll do in this tutorial. Learning to do this with Python will mean that there are lots of tutorials, how-to videos, and bits of example code out there to help you deepen your knowledge once you've mastered the Beautiful Soup basics.

Is Web Scraping Legal?

Unfortunately, there's not a cut-and-dry answer here. Some websites explicitly allow web scraping. Others explicitly forbid it. Many websites don't offer any clear guidance one way or the other.

Before scraping any website, we should look for a terms and conditions page to see if there are explicit rules about scraping. If there are, we should follow them. If there are not, then it becomes more of a judgment call. Remember, though, that web scraping consumes server resources for the host website. If we're just scraping one page once, that isn't going to cause a problem. But if our code is scraping 1,000 pages once every ten minutes, that could quickly get expensive for the website owner. So, in addition to following any and all explicit rules about web scraping posted on the site, it's also a good idea to follow these best practices:

Web Scraping Best Practices:

Never scrape more frequently than you need to.
Consider caching the content you scrape so that it's only downloaded once.
Build pauses into your code using functions like time.sleep() to keep from overwhelming servers with too many requests too quickly.

In our case for this tutorial, the NWS's data is public domain and its terms do not forbid web scraping, so we're in the clear to proceed.

Learn to scrape the web with Python, right in your browser! Our interactive APIs and Web Scraping in Python skill path will help you learn the skills you need to unlock new worlds of data with Python. (No credit card required!)

The Components of a Web Page

Before we start writing code, we need to understand a little bit about the structure of a web page. We'll use the site's structure to write code that gets us the data we want to scrape, so understanding that structure is an important first step for any web scraping project.

When we visit a web page, our web browser makes a request to a web server. This request is called a GET request, since we're getting files from the server. The server then sends back files that tell our browser how to render the page for us. These files will typically include:

HTML — the main content of the page.
CSS — used to add styling to make the page look nicer.
JS — Javascript files add interactivity to web pages.
Images — image formats, such as JPG and PNG, allow web pages to show pictures.

After our browser receives all the files, it renders the page and displays it to us. There's a lot that happens behind the scenes to render a page nicely, but we don't need to worry about most of it when we're web scraping. When we perform web scraping, we're interested in the main content of the web page, so we look primarily at the HTML.

HTML

HyperText Markup Language (HTML) is the language that web pages are created in. HTML isn't a programming language, like Python, though. It's a markup language that tells a browser how to display content.
HTML has many functions that are similar to what you might find in a word processor like Microsoft Word — it can make text bold, create paragraphs, and so on. If you're already familiar with HTML, feel free to jump to the next section of this tutorial. Otherwise, let's take a quick tour through HTML so we know enough to scrape effectively.

HTML consists of elements called tags. The most basic tag is the <html> tag. This tag tells the web browser that everything inside of it is HTML. We can make a simple HTML document just using this tag:

<html>
</html>

We haven't added any content to our page yet, so if we viewed our HTML document in a web browser, we wouldn't see anything.

Right inside an html tag, we can put two other tags: the head tag, and the body tag. The main content of the web page goes into the body tag. The head tag contains data about the title of the page, and other information that generally isn't useful in web scraping:

<html>
<head>
</head>
<body>
</body>
</html>

We still haven't added any content to our page (that goes inside the body tag), so if we open this HTML file in a browser, we still won't see anything.

You may have noticed above that we put the head and body tags inside the html tag. In HTML, tags are nested, and can go inside other tags.

We'll now add our first content to the page, inside a p tag. The p tag defines a paragraph, and any text inside the tag is shown as a separate paragraph:

<html>
<head>
</head>
<body>
<p>
Here's a paragraph of text!
</p>
<p>
Here's a second paragraph of text!
</p>
</body>
</html>

Rendered in a browser, that HTML file will look like this:

Here's a paragraph of text!

Here's a second paragraph of text!

Tags have commonly used names that depend on their position in relation to other tags:

child — a child is a tag inside another tag. So the two p tags above are both children of the body tag.
parent — a parent is the tag another tag is inside. Above, the html tag is the parent of the body tag.
sibling — a sibling is a tag that is nested inside the same parent as another tag. For example, head and body are siblings, since they're both inside html. Both p tags are siblings, since they're both inside body.

We can also add properties to HTML tags that change their behavior. Below, we'll add some extra text and hyperlinks using the a tag.

Here’s a paragraph of text! Python

Here's how this will look:

In the above example, we added two a tags. a tags are links, and tell the browser to render a link to another web page. The href property of the tag determines where the link goes.

a and p are extremely common html tags. Here are a few others:

div — indicates a division, or area, of the page.
b — bolds any text inside.
i — italicizes any text inside.
table — creates a table.
form — creates an input form.

For a full list of tags, look here.

Before we move into actual web scraping, let's learn about the class and id properties. These special properties give HTML elements names, and make them easier to interact with when we're scraping. One element can have multiple classes, and a class can be shared between elements. Each element can only have one id, and an id can only be used once on a page. Classes and ids are optional, and not all elements will have them. We can add classes and ids to our example:

Here’s a paragraph of text! Learn Data Science Online

Here’s a second paragraph of text! Python

Here's how this will look:

As you can see, adding classes and ids doesn't change how the tags are rendered at all.

The requests library

Now that we understand the structure of a web page, it's time to get into the fun part: scraping the content we want! The first thing we'll need to do to scrape a web page is to download the page. We can download pages using the Python requests library. The requests library will make a GET request to a web server, which will download the HTML contents of a given web page for us. There are several different types of requests we can make using requests, of which GET is just one. If you want to learn more, check out our API tutorial.

Let's try downloading a simple sample website, https://dataquestio.github.io/web-scraping-pages/simple.html. We'll need to first import the requests library, and then download the page using the requests.get method:

import requests
page = requests.get("https://dataquestio.github.io/web-scraping-pages/simple.html")
page
After running our request, we get a Response object. This object has a status_code property, which indicates if the page was downloaded successfully:

page.status_code
200

A status_code of 200 means that the page downloaded successfully. We won't fully dive into status codes here, but a status code starting with a 2 generally indicates success, and a code starting with a 4 or a 5 indicates an error. We can print out the HTML content of the page using the content property:

page.content



<!DOCTYPE html>
<html>
<head>
<title>A simple example page</title>
</head>
<body>
<p>Here is some simple content for this page.</p>
</body>
</html>
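Before moving on to parsing, it can be worth confirming programmatically that the download succeeded. This is only a sketch of one way to do it with the requests library; the URL is the sample page used above, and the error handling is our addition rather than part of the original tutorial:

import requests

url = "https://dataquestio.github.io/web-scraping-pages/simple.html"
page = requests.get(url)

try:
    # raise_for_status() raises requests.exceptions.HTTPError for 4xx/5xx codes
    page.raise_for_status()
except requests.exceptions.HTTPError as err:
    print(f"Download failed: {err}")
else:
    print(page.status_code)   # 200 on success
    html = page.content       # raw bytes, ready to hand to BeautifulSoup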


Parsing a page with BeautifulSoup

As you can see above, we now have downloaded an HTML document. We can use the BeautifulSoup library to parse this document, and extract the text from the p tag. We first have to import the library, and create an instance of the BeautifulSoup class to parse our document:

from bs4 import BeautifulSoup
soup = BeautifulSoup(page.content, 'html.parser')

We can now print out the HTML content of the page, formatted nicely, using the prettify method on the BeautifulSoup object:

print(soup.prettify())

This step isn't strictly necessary, and we won't always bother with it, but it can be helpful to look at prettified HTML to make the structure of the page, and where tags are nested, easier to see. As all the tags are nested, we can move through the structure one level at a time. We can first select all the elements at the top level of the page using the children property of soup. Note that children returns a list generator, so we need to call the list function on it:

list(soup.children)
['html', '\n', <html>
<head>
<title>A simple example page</title>
</head>
<body>
<p>Here is some simple content for this page.</p>
</body>
</html>]

The above tells us that there are two tags at the top level of the page — the initial <!DOCTYPE html> tag, and the <html> tag. There is a newline character (\n) in the list as well. Let's see what the type of each element in the list is:

[type(item) for item in list(soup.children)]
[bs4.element.Doctype, bs4.element.NavigableString, bs4.element.Tag]

As we can see, all of the items are BeautifulSoup objects:

The first is a Doctype object, which contains information about the type of the document.
The second is a NavigableString, which represents text found in the HTML document.
The final item is a Tag object, which contains other nested tags.

The most important object type, and the one we'll deal with most often, is the Tag object. The Tag object allows us to navigate through an HTML document, and extract other tags and text. You can learn more about the various BeautifulSoup objects here.

We can now select the html tag and its children by taking the third item in the list:

html = list(soup.children)[2]

Each item in the list returned by the children property is also a BeautifulSoup object, so we can also call the children method on html. Now, we can find the children inside the html tag:

list(html.children)

['\n', <head>
<title>A simple example page</title>
</head>, '\n', <body>
<p>Here is some simple content for this page.</p>
</body>, '\n']

As we can see above, there are two tags here, head, and body. We want to extract the text inside the p tag, so we'll dive into the body:

body = list(html.children)[3]

Now, we can get the p tag by finding the children of the body tag:

list(body.children)
['\n', <p>Here is some simple content for this page.</p>, '\n']

We can now isolate the p tag:

p = list(body.children)[1]

Once we've isolated the tag, we can use the get_text method to extract all of the text inside the tag:

p.get_text()
'Here is some simple content for this page.'

Finding all instances of a tag at once

What we did above was useful for figuring out how to navigate a page, but it took a lot of commands to do something fairly simple. If we want to extract a single tag, we can instead use the find_all method, which will find all the instances of a tag on a page.

soup = BeautifulSoup(page.content, 'html.parser')
soup.find_all('p')

[<p>Here is some simple content for this page.</p>]

Note that find_all returns a list, so we'll have to loop through it, or use list indexing, to extract text:

soup.find_all('p')[0].get_text()

'Here is some simple content for this page.'

If you instead only want to find the first instance of a tag, you can use the find method, which will return a single BeautifulSoup object:

soup.find('p')

<p>Here is some simple content for this page.</p>
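To tie find_all and get_text together, here is a small self-contained sketch; the HTML snippet in it is made up purely for illustration:

from bs4 import BeautifulSoup

# A made-up snippet with more than one paragraph to loop over
html = """
<html><body>
<p>Paragraph one.</p>
<p>Paragraph two.</p>
</body></html>
"""

soup = BeautifulSoup(html, "html.parser")

# find_all returns a list of Tag objects; get_text extracts the text from each one
for p in soup.find_all("p"):
    print(p.get_text())
# Paragraph one.
# Paragraph two.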

Searching for tags by class and id

We introduced classes and ids earlier, but it probably wasn't clear why they were useful. Classes and ids are used by CSS to determine which HTML elements to apply certain styles to. But when we're scraping, we can also use them to specify the elements we want to scrape. To illustrate this principle, we'll work with the following page:

<html>
<head>
<title>A simple example page</title>
</head>
<body>
<div>
<p class="inner-text" id="first">
First paragraph.
</p>
<p class="inner-text">
Second paragraph.
</p>
</div>
<p class="outer-text">
First outer paragraph.
</p>
<p class="outer-text">
Second outer paragraph.
</p>
</body>
</html>
We can access the above document at the URL https://dataquestio.github.io/web-scraping-pages/ids_and_classes.html. Let's first download the page and create a BeautifulSoup object:

page = requests.get("https://dataquestio.github.io/web-scraping-pages/ids_and_classes.html")
soup = BeautifulSoup(page.content, 'html.parser')
soup

This prints the parsed document, which is the same HTML we listed above (note the title, A simple example page).

Now, we can use the find_all method to search for items by class or by id. In the below example, we'll search for any p tag that has the class outer-text:

soup.find_all('p', class_='outer-text')
[<p class="outer-text">
First outer paragraph.
</p>,
<p class="outer-text">
Second outer paragraph.
</p>]

In the below example, we'll look for any tag that has the class outer-text:

soup.find_all(class_="outer-text")

[<p class="outer-text">
First outer paragraph.
</p>,
<p class="outer-text">
Second outer paragraph.
</p>]

We can also search for elements by id:

soup.find_all(id="first")

[<p class="inner-text" id="first">
First paragraph.
</p>]

Using CSS Selectors

We can also search for items using CSS selectors. These selectors are how the CSS language allows developers to specify HTML tags to style. Here are some examples:

p a — finds all a tags inside of a p tag.
body p a — finds all a tags inside of a p tag inside of a body tag.
html body — finds all body tags inside of an html tag.
p.outer-text — finds all p tags with a class of outer-text.
p#first — finds all p tags with an id of first.
body p.outer-text — finds any p tags with a class of outer-text inside of a body tag.

You can learn more about CSS selectors here. BeautifulSoup objects support searching a page via CSS selectors using the select method. We can use CSS selectors to find all the p tags in our page that are inside of a div like this:

soup.select("div p")

[<p class="inner-text" id="first">
First paragraph.
</p>,
<p class="inner-text">
Second paragraph.
</p>]

Note that the select method above returns a list of BeautifulSoup objects, just like find and find_all.

Downloading weather data

We now know enough to proceed with extracting information about the local weather from the National Weather Service website! The first step is to find the page we want to scrape. We'll extract weather information about downtown San Francisco from this page. Specifically, let's extract data about the extended forecast.

As we can see from the image, the page has information about the extended forecast for the next week, including time of day, temperature, and a brief description of the conditions.

Exploring page structure with Chrome DevTools

The first thing we'll need to do is inspect the page using Chrome DevTools. If you're using another browser, Firefox and Safari have equivalents.

You can start the developer tools in Chrome by clicking View -> Developer -> Developer Tools. You should end up with a panel at the bottom of the browser like what you see below. Make sure the Elements panel is highlighted. (Image: Chrome Developer Tools)

The Elements panel will show you all the HTML tags on the page, and let you navigate through them. It's a really handy feature! By right-clicking on the page near where it says "Extended Forecast", then clicking "Inspect", we'll open up the tag that contains the text "Extended Forecast" in the elements panel. (Image: the Extended Forecast text)

We can then scroll up in the elements panel to find the "outermost" element that contains all of the text that corresponds to the extended forecasts. In this case, it's a div tag with the id seven-day-forecast. (Image: the div that contains the extended forecast)

If we click around on the console and explore the div, we'll discover that each forecast item (like "Tonight", "Thursday", and "Thursday Night") is contained in a div with the class tombstone-container.

Time to Start Scraping!

We now know enough to download the page and start parsing it. In the below code, we will:

Download the web page containing the forecast.
Create a BeautifulSoup class to parse the page.
Find the div with id seven-day-forecast, and assign it to seven_day.
Inside seven_day, find each individual forecast item.
Extract and print the first forecast item.

page = requests.get("https://forecast.weather.gov/MapClick.php?lat=37.7772&lon=-122.4168")
soup = BeautifulSoup(page.content, 'html.parser')
seven_day = soup.find(id="seven-day-forecast")
forecast_items = seven_day.find_all(class_="tombstone-container")
tonight = forecast_items[0]
print(tonight.prettify())

<div class="tombstone-container">
 <p class="period-name">
  Tonight
  <br/>
  <br/>
 </p>
 <p>
  <img alt="Tonight: Mostly clear, with a low around 49. West northwest wind 12 to 17 mph decreasing to 6 to 11 mph after midnight. Winds could gust as high as 23 mph." class="forecast-icon" src="..." title="Tonight: Mostly clear, with a low around 49. West northwest wind 12 to 17 mph decreasing to 6 to 11 mph after midnight. Winds could gust as high as 23 mph."/>
 </p>
 <p class="short-desc">
  Mostly Clear
 </p>
 <p class="temp temp-low">
  Low: 49 °F
 </p>
</div>

Extracting information from the page

As we can see, inside the forecast item tonight is all the information we want. There are four pieces of information we can extract:

The name of the forecast item — in this case, Tonight.
The description of the conditions — this is stored in the title property of img.
A short description of the conditions — in this case, Mostly Clear.
The temperature low — in this case, 49 °F.

We'll extract the name of the forecast item, the short description, and the temperature first, since they're all similar:

period = tonight.find(class_="period-name").get_text()
short_desc = tonight.find(class_="short-desc").get_text()
temp = tonight.find(class_="temp").get_text()
print(period)
print(short_desc)
print(temp)
Tonight
Mostly Clear
Low: 49 °F

Now, we can extract the title attribute from the img tag. To do this, we just treat the BeautifulSoup object like a dictionary, and pass in the attribute we want as a key:

img = tonight.find("img")
desc = img['title']
print(desc)
Tonight: Mostly clear, with a low around 49. West northwest wind 12 to 17 mph decreasing to 6 to 11 mph after midnight. Winds could gust as high as 23 mph.

Extracting all the information from the page

Now that we know how to extract each individual piece of information, we can combine our knowledge with CSS selectors and list comprehensions to extract everything at once.

In the below code, we will:

Select all items with the class period-name inside an item with the class tombstone-container in seven_day.
Use a list comprehension to call the get_text method on each BeautifulSoup object.

period_tags = seven_day.select(".tombstone-container .period-name")
periods = [pt.get_text() for pt in period_tags]
periods
[‘Tonight’,
‘Thursday’,
‘ThursdayNight’,
‘Friday’,
‘FridayNight’,
‘Saturday’,
‘SaturdayNight’,
‘Sunday’,
'SundayNight']

As we can see above, our technique gets us each of the period names, in order. We can apply the same technique to get the other three fields:

short_descs = [sd.get_text() for sd in seven_day.select(".tombstone-container .short-desc")]
temps = [t.get_text() for t in seven_day.select(".tombstone-container .temp")]
descs = [d["title"] for d in seven_day.select(".tombstone-container img")]
print(short_descs)
print(temps)
print(descs)
[‘Mostly Clear’, ‘Sunny’, ‘Mostly Clear’, ‘Sunny’, ‘Slight ChanceRain’, ‘Rain Likely’, ‘Rain Likely’, ‘Rain Likely’, ‘Chance Rain’]
[‘Low: 49 °F’, ‘High: 63 °F’, ‘Low: 50 °F’, ‘High: 67 °F’, ‘Low: 57 °F’, ‘High: 64 °F’, ‘Low: 57 °F’, ‘High: 64 °F’, ‘Low: 55 °F’]
['Tonight: Mostly clear, with a low around 49. ', 'Thursday: Sunny, with a high near 63. North wind 3 to 5 mph. ', 'Thursday Night: Mostly clear, with a low around 50. Light and variable wind becoming east southeast 5 to 8 mph after midnight. ', 'Friday: Sunny, with a high near 67. Southeast wind around 9 mph. ', 'Friday Night: A 20 percent chance of rain after 11pm. Partly cloudy, with a low around 57. South southeast wind 13 to 15 mph, with gusts as high as 20 mph. New precipitation amounts of less than a tenth of an inch possible. ', 'Saturday: Rain likely. Cloudy, with a high near 64. Chance of precipitation is 70%. New precipitation amounts between a quarter and half of an inch possible. ', 'Saturday Night: Rain likely. Cloudy, with a low around 57. Chance of precipitation is 60%. ', 'Sunday: Rain likely. ', 'Sunday Night: A chance of rain. Mostly cloudy, with a low around 55. ']

Combining our data into a Pandas DataFrame

We can now combine the data into a Pandas DataFrame and analyze it. A DataFrame is an object that can store tabular data, making data analysis easy. If you want to learn more about Pandas, check out our free-to-start course. In order to do this, we'll call the DataFrame class, and pass in each list of items that we have. We pass them in as part of a dictionary. Each dictionary key will become a column in the DataFrame, and each list will become the values in the column:

import pandas as pd
weather = pd.DataFrame({
    "period": periods,
    "short_desc": short_descs,
    "temp": temps,
    "desc": descs
})
weather
   desc                                              period         short_desc         temp
0  Tonight: Mostly clear, with a low around 49. W…  Tonight        Mostly Clear       Low: 49 °F
1  Thursday: Sunny, with a high near 63. North wi…  Thursday       Sunny              High: 63 °F
2  Thursday Night: Mostly clear, with a low aroun…  ThursdayNight  Mostly Clear       Low: 50 °F
3  Friday: Sunny, with a high near 67. Southeast …  Friday         Sunny              High: 67 °F
4  Friday Night: A 20 percent chance of rain afte…  FridayNight    Slight ChanceRain  Low: 57 °F
5  Saturday: Rain likely. Cloudy, with a high ne…   Saturday       Rain Likely        High: 64 °F
6  Saturday Night: Rain likely. Cloudy, with a l…   SaturdayNight  Rain Likely        Low: 57 °F
7  Sunday: Rain likely. Cloudy, with a high near…   Sunday         Rain Likely        High: 64 °F
8  Sunday Night: A chance of rain. Mostly cloudy…   SundayNight    Chance Rain        Low: 55 °F
We can now do some analysis on the data. For example, we can use a regular expression and the Series.str.extract method to pull out the numeric temperature values:

temp_nums = weather["temp"].str.extract("(?P<temp_num>\d+)", expand=False)
weather["temp_num"] = temp_nums.astype('int')
temp_nums
0 49
1 63
2 50
3 67
4 57
5 64
6 57
7 64
8 55
Name: temp_num, dtype: object

We could then find the mean of all the high and low temperatures:

weather["temp_num"].mean()
58.444444444444443

We could also only select the rows that happen at night:

is_night = weather["temp"].str.contains("Low")
weather[“is_night”] = is_night
is_night
0 True
1 False
2 True
3 False
4 True
5 False
6 True
7 False
8 True
Name: temp, dtype: bool

weather[is_night]
   desc                                              period         short_desc         temp        temp_num  is_night
0  Tonight: Mostly clear, with a low around 49. W…  Tonight        Mostly Clear       Low: 49 °F  49        True
2  Thursday Night: Mostly clear, with a low aroun…  ThursdayNight  Mostly Clear       Low: 50 °F  50        True
4  Friday Night: A 20 percent chance of rain afte…  FridayNight    Slight ChanceRain  Low: 57 °F  57        True
6  Saturday Night: Rain likely. Cloudy, with a l…   SaturdayNight  Rain Likely        Low: 57 °F  57        True
8  Sunday Night: A chance of rain. Mostly cloudy…   SundayNight    Chance Rain        Low: 55 °F  55        True
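As one further example of what we can do with this DataFrame (our own extension, not part of the original tutorial), the is_night column lets us compare the average overnight low with the average daytime high using a groupby; this continues from the weather DataFrame built above:

# True rows hold the nighttime lows, False rows hold the daytime highs
print(weather.groupby("is_night")["temp_num"].mean())
# is_night
# False    64.5
# True     53.6
# Name: temp_num, dtype: float64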
Next Steps For This Web Scraping Project

If you've made it this far, congratulations! You should now have a good understanding of how to scrape web pages and extract data. Of course, there's still a lot more to learn!

If you want to go further, a good next step would be to pick a site and try some web scraping on your own. Some good examples of data to scrape are:

News articles
Sports scores
Weather forecasts
Stock prices
Online retailer prices

You may also want to keep scraping the National Weather Service, and see what other data you can extract from the page, or about your own city.

Alternatively, if you want to take your web scraping skills to the next level, you can check out our interactive course, which covers both the basics of web scraping and using Python to connect to APIs. With those two skills under your belt, you'll be able to collect lots of unique and interesting datasets from sites all over the web!

Learn to scrape the web with Python, right in your browser! Our interactive APIs and Web Scraping in Python skill path will help you learn the skills you need to unlock new worlds of data with Python. (No credit card required!)
Beautiful Soup: Build a Web Scraper With Python


Watch Now This tutorial has a related video course created by the Real Python team. Watch it together with the written tutorial to deepen your understanding: Web Scraping With Beautiful Soup and Python
The incredible amount of data on the Internet is a rich resource for any field of research or personal interest. To effectively harvest that data, you’ll need to become skilled at web scraping. The Python libraries requests and Beautiful Soup are powerful tools for the job. If you like to learn with hands-on examples and have a basic understanding of Python and HTML, then this tutorial is for you.
In this tutorial, you’ll learn how to:
Inspect the HTML structure of your target site with your browser’s developer tools
Decipher data encoded in URLs
Use requests and Beautiful Soup for scraping and parsing data from the Web
Step through a web scraping pipeline from start to finish
Build a script that fetches job offers from the Web and displays relevant information in your console
Working through this project will give you the knowledge of the process and tools you need to scrape any static website out there on the World Wide Web. You can download the project source code by clicking on the link below:
Let’s get started!
What Is Web Scraping?
Web scraping is the process of gathering information from the Internet. Even copying and pasting the lyrics of your favorite song is a form of web scraping! However, the words “web scraping” usually refer to a process that involves automation. Some websites don’t like it when automatic scrapers gather their data, while others don’t mind.
If you’re scraping a page respectfully for educational purposes, then you’re unlikely to have any problems. Still, it’s a good idea to do some research on your own and make sure that you’re not violating any Terms of Service before you start a large-scale project.
Reasons for Web Scraping
Say you’re a surfer, both online and in real life, and you’re looking for employment. However, you’re not looking for just any job. With a surfer’s mindset, you’re waiting for the perfect opportunity to roll your way!
There’s a job site that offers precisely the kinds of jobs you want. Unfortunately, a new position only pops up once in a blue moon, and the site doesn’t provide an email notification service. You think about checking up on it every day, but that doesn’t sound like the most fun and productive way to spend your time.
Thankfully, the world offers other ways to apply that surfer’s mindset! Instead of looking at the job site every day, you can use Python to help automate your job search’s repetitive parts. Automated web scraping can be a solution to speed up the data collection process. You write your code once, and it will get the information you want many times and from many pages.
In contrast, when you try to get the information you want manually, you might spend a lot of time clicking, scrolling, and searching, especially if you need large amounts of data from websites that are regularly updated with new content. Manual web scraping can take a lot of time and repetition.
There’s so much information on the Web, and new information is constantly added. You’ll probably be interested in at least some of that data, and much of it is just out there for the taking. Whether you’re actually on the job hunt or you want to download all the lyrics of your favorite artist, automated web scraping can help you accomplish your goals.
Challenges of Web Scraping
The Web has grown organically out of many sources. It combines many different technologies, styles, and personalities, and it continues to grow to this day. In other words, the Web is a hot mess! Because of this, you’ll run into some challenges when scraping the Web:
Variety: Every website is different. While you’ll encounter general structures that repeat themselves, each website is unique and will need personal treatment if you want to extract the relevant information.
Durability: Websites constantly change. Say you’ve built a shiny new web scraper that automatically cherry-picks what you want from your resource of interest. The first time you run your script, it works flawlessly. But when you run the same script only a short while later, you run into a discouraging and lengthy stack of tracebacks!
Unstable scripts are a realistic scenario, as many websites are in active development. Once the site’s structure has changed, your scraper might not be able to navigate the sitemap correctly or find the relevant information. The good news is that many changes to websites are small and incremental, so you’ll likely be able to update your scraper with only minimal adjustments.
However, keep in mind that because the Internet is dynamic, the scrapers you’ll build will probably require constant maintenance. You can set up continuous integration to run scraping tests periodically to ensure that your main script doesn’t break without your knowledge.
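One lightweight way to act on the maintenance advice above is a periodic test that fails loudly when the page structure changes. The sketch below uses pytest, and the target URL and element id are hypothetical placeholders; adapt both to the page and element your own scraper depends on.

import requests
from bs4 import BeautifulSoup

# Hypothetical target; replace with the page your scraper actually reads
URL = "https://example.com/listings"

def test_listing_container_still_exists():
    page = requests.get(URL, timeout=10)
    assert page.status_code == 200

    soup = BeautifulSoup(page.content, "html.parser")
    # If a redesign removes this container, the test fails and alerts us
    # before the production scraper silently starts returning nothing.
    assert soup.find(id="listing-container") is not None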
An Alternative to Web Scraping: APIs
Some website providers offer application programming interfaces (APIs) that allow you to access their data in a predefined manner. With APIs, you can avoid parsing HTML. Instead, you can access the data directly using formats like JSON and XML. HTML is primarily a way to present content to users visually.
When you use an API, the process is generally more stable than gathering the data through web scraping. That’s because developers create APIs to be consumed by programs rather than by human eyes.
The front-end presentation of a site might change often, but such a change in the website’s design doesn’t affect its API structure. The structure of an API is usually more permanent, which means it’s a more reliable source of the site’s data.
However, APIs can change as well. The challenges of both variety and durability apply to APIs just as they do to websites. Additionally, it’s much harder to inspect the structure of an API by yourself if the provided documentation lacks quality.
The approach and tools you need to gather information using APIs are outside the scope of this tutorial. To learn more about it, check out API Integration in Python.
Scrape the Fake Python Job Site
In this tutorial, you’ll build a web scraper that fetches Python software developer job listings from the Fake Python Jobs site. It’s an example site with fake job postings that you can freely scrape to train your skills. Your web scraper will parse the HTML on the site to pick out the relevant information and filter that content for specific words.
You can scrape any site on the Internet that you can look at, but the difficulty of doing so depends on the site. This tutorial offers you an introduction to web scraping to help you understand the overall process. Then, you can apply this same process for every website you’ll want to scrape.
Throughout the tutorial, you’ll also encounter a few exercise blocks. You can click to expand them and challenge yourself by completing the tasks described there.
Step 1: Inspect Your Data Source
Before you write any Python code, you need to get to know the website that you want to scrape. That should be your first step for any web scraping project you want to tackle. You’ll need to understand the site structure to extract the information that’s relevant for you. Start by opening the site you want to scrape with your favorite browser.
Explore the Website
Click through the site and interact with it just like any typical job searcher would. For example, you can scroll through the main page of the website:
You can see many job postings in a card format, and each of them has two buttons. If you click Apply, then you’ll see a new page that contains more detailed descriptions of the selected job. You might also notice that the URL in your browser’s address bar changes when you interact with the website.
Decipher the Information in URLs
A programmer can encode a lot of information in a URL. Your web scraping journey will be much easier if you first become familiar with how URLs work and what they’re made of. For example, you might find yourself on a details page that has the following URL:
You can deconstruct the above URL into two main parts:
The base URL represents the path to the search functionality of the website. In the example above, the base URL is https://realpython.github.io/fake-jobs/. The specific site location that ends with .html is the path to the job description's unique resource.
Any job posted on this website will use the same base URL. However, the unique resources’ location will be different depending on what specific job posting you’re viewing.
URLs can hold more information than just the location of a file. Some websites use query parameters to encode values that you submit when performing a search. You can think of them as query strings that you send to the database to retrieve specific records.
You’ll find query parameters at the end of a URL. For example, if you go to Indeed and search for “software developer” in “Australia” through their search bar, you’ll see that the URL changes to include these values as query parameters:
The query parameters in this URL are ?q=software+developer&l=Australia. Query parameters consist of three parts:
Start: The beginning of the query parameters is denoted by a question mark (?).
Information: The pieces of information constituting one query parameter are encoded in key-value pairs, where related keys and values are joined together by an equals sign (key=value).
Separator: Every URL can have multiple query parameters, separated by an ampersand symbol (&).
Equipped with this information, you can pick apart the URL’s query parameters into two key-value pairs:
q=software+developer selects the type of job.
l=Australia selects the location of the job.
Try to change the search parameters and observe how that affects your URL. Go ahead and enter new values in the search bar up top:
Change these values to observe the changes in the URL.
Next, try to change the values directly in your URL. See what happens when you paste the following URL into your browser’s address bar:
If you change and submit the values in the website’s search box, then it’ll be directly reflected in the URL’s query parameters and vice versa. If you change either of them, then you’ll see different results on the website.
As you can see, exploring the URLs of a site can give you insight into how to retrieve data from the website’s server.
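If you want to take URLs apart programmatically rather than by eye, Python's standard library can do it. The sketch below is our own illustration using urllib.parse on an Indeed-style URL with the two query parameters discussed above; the exact host is an assumption made up for the example:

from urllib.parse import urlparse, parse_qs

# An Indeed-style search URL carrying the q and l query parameters
url = "https://au.indeed.com/jobs?q=software+developer&l=Australia"

parts = urlparse(url)
print(parts.path)              # /jobs
print(parse_qs(parts.query))   # {'q': ['software developer'], 'l': ['Australia']}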
Head back to Fake Python Jobs and continue exploring it. This site is a purely static website that doesn’t operate on top of a database, which is why you won’t have to work with query parameters in this scraping tutorial.
Inspect the Site Using Developer Tools
Next, you’ll want to learn more about how the data is structured for display. You’ll need to understand the page structure to pick what you want from the HTML response that you’ll collect in one of the upcoming steps.
Developer tools can help you understand the structure of a website. All modern browsers come with developer tools installed. In this section, you’ll see how to work with the developer tools in Chrome. The process will be very similar to other modern browsers.
In Chrome on macOS, you can open up the developer tools through the menu by selecting View → Developer → Developer Tools. On Windows and Linux, you can access them by clicking the top-right menu button (⋮) and selecting More Tools → Developer Tools. You can also access your developer tools by right-clicking on the page and selecting the Inspect option or using a keyboard shortcut:
Mac: Cmd+Alt+I
Windows/Linux: Ctrl+Shift+I
Developer tools allow you to interactively explore the site’s document object model (DOM) to better understand your source. To dig into your page’s DOM, select the Elements tab in developer tools. You’ll see a structure with clickable HTML elements. You can expand, collapse, and even edit elements right in your browser:
The HTML on the right represents the structure of the page you can see on the left.
You can think of the text displayed in your browser as the HTML structure of that page. If you’re interested, then you can read more about the difference between the DOM and HTML on CSS-TRICKS.
When you right-click elements on the page, you can select Inspect to zoom to their location in the DOM. You can also hover over the HTML text on your right and see the corresponding elements light up on the page.
Click to expand the exercise block for a specific task to practice using your developer tools:
Find a single job posting. What HTML element is it wrapped in, and what other HTML elements does it contain?
Play around and explore! The more you get to know the page you’re working with, the easier it will be to scrape it. However, don’t get too overwhelmed with all that HTML text. You’ll use the power of programming to step through this maze and cherry-pick the information that’s relevant to you.
Step 2: Scrape HTML Content From a Page
Now that you have an idea of what you’re working with, it’s time to start using Python. First, you’ll want to get the site’s HTML code into your Python script so that you can interact with it. For this task, you’ll use Python’s requests library.
Create a virtual environment for your project before you install any external package. Activate your new virtual environment, then type the following command in your terminal to install the external requests library:
$ python -m pip install requests
Then open up a new file in your favorite text editor. All you need to retrieve the HTML are a few lines of code:
import requests
URL = "https://realpython.github.io/fake-jobs/"
page = requests.get(URL)
print(page.text)
This code issues an HTTP GET request to the given URL. It retrieves the HTML data that the server sends back and stores that data in a Python object.
If you print the .text attribute of page, then you’ll notice that it looks just like the HTML that you inspected earlier with your browser’s developer tools. You successfully fetched the static site content from the Internet! You now have access to the site’s HTML from within your Python script.
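As a small optional aside that isn't part of the tutorial itself, saving the fetched HTML to a local file lets you inspect it in an editor, or re-run your parsing code later without hitting the server again:

import requests

URL = "https://realpython.github.io/fake-jobs/"
page = requests.get(URL)

# Write the HTML to disk so it can be opened in an editor or reparsed offline
with open("fake_jobs.html", "w", encoding="utf-8") as f:
    f.write(page.text)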
Static Websites
The website that you’re scraping in this tutorial serves static HTML content. In this scenario, the server that hosts the site sends back HTML documents that already contain all the data that you’ll get to see as a user.
When you inspected the page with developer tools earlier on, you discovered that a job posting consists of the following long and messy-looking HTML:


Senior Python Developer

Payne, Roberts and Davis

Stewartbury, AA

doc = BeautifulSoup("<document><content/>INSERT FOOTER HERE</document>", "xml")
footer = BeautifulSoup("<footer>Here's the footer</footer>", "xml")
doc.find(text="INSERT FOOTER HERE").replace_with(footer)
# 'INSERT FOOTER HERE'
print(doc)
# <?xml version="1.0" encoding="utf-8"?>
# <document><content/><footer>Here's the footer</footer></document>


Since the BeautifulSoup object doesn't correspond to an actual HTML or XML tag, it has no name and no attributes. But sometimes it's useful to look at its .name, so it's been given the special .name "[document]":

soup.name
# '[document]'
Here’s the “Three sisters” HTML document again:
html_doc = """<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>

<p class="story">Once upon a time there were three little sisters; and their names were
<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>,
<a href="http://example.com/lacie" class="sister" id="link2">Lacie</a> and
<a href="http://example.com/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>

<p class="story">...</p>
"""

from bs4 import BeautifulSoup
soup = BeautifulSoup(html_doc, 'html.parser')
I’ll use this as an example to show you how to move from one part of
a document to another.
Going down¶
Tags may contain strings and other tags. These elements are the tag’s
children. Beautiful Soup provides a lot of different attributes for
navigating and iterating over a tag’s children.
Note that Beautiful Soup strings don’t support any of these
attributes, because a string can’t have children.
Navigating using tag names¶
The simplest way to navigate the parse tree is to say the name of the
tag you want. If you want the <head> tag, just say soup.head:

soup.head
# <head><title>The Dormouse's story</title></head>

soup.title
# <title>The Dormouse's story</title>

You can use this trick again and again to zoom in on a certain part
of the parse tree. This code gets the first <b> tag beneath the <body> tag:

soup.body.b
# <b>The Dormouse's story</b>

Using a tag name as an attribute will give you only the first tag by that
name:

soup.a
# <a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>

If you need to get all the <a> tags, or anything more complicated
than the first tag with a certain name, you'll need to use one of the
methods described in Searching the tree, such as find_all():

soup.find_all('a')
# [<a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>,
#  <a class="sister" href="http://example.com/lacie" id="link2">Lacie</a>,
#  <a class="sister" href="http://example.com/tillie" id="link3">Tillie</a>]

.contents and .children¶
A tag's children are available in a list called .contents:

head_tag = soup.head
head_tag
# <head><title>The Dormouse's story</title></head>

head_tag.contents
# [<title>The Dormouse's story</title>]

title_tag = head_tag.contents[0]
title_tag
# <title>The Dormouse's story</title>
title_tag.contents
# ['The Dormouse's story']
The BeautifulSoup object itself has children. In this case, the
<html> tag is the child of the BeautifulSoup object:

len(soup.contents)
# 1
soup.contents[0].name
# 'html'

A string does not have .contents, because it can't contain
anything:

text = title_tag.contents[0]
text.contents
# AttributeError: 'NavigableString' object has no attribute 'contents'
Instead of getting them as a list, you can iterate over a tag's
children using the .children generator:

for child in title_tag.children:
    print(child)
# The Dormouse's story

.descendants¶

The .contents and .children attributes only consider a tag's
direct children. For instance, the <head> tag has a single direct
child, the <title> tag:

head_tag.contents
# [<title>The Dormouse's story</title>]
child–the tag:<br /> But the <title> tag itself has a child: the string “The Dormouse’s<br /> story”. There’s a sense in which that string is also a child of the<br /> <head> tag. The. descendants attribute lets you iterate over all<br /> of a tag’s children, recursively: its direct children, the children of<br /> its direct children, and so on:<br /> for child in scendants:<br /> The <head> tag has only one child, but it has two descendants: the<br /> <title> tag and the <title> tag’s child. The BeautifulSoup object<br /> only has one direct child (the <html> tag), but it has a whole lot of<br /> descendants:<br /> len(list(ildren))<br /> len(list(scendants))<br /> # 26<br /> ¶<br /> If a tag has only one child, and that child is a NavigableString,<br /> the child is made available as<br /> # ‘The Dormouse’s story’<br /> If a tag’s only child is another tag, and that tag has a, then the parent tag is considered to have the same<br /> as its child:<br /> If a tag contains more than one thing, then it’s not clear what<br /> should refer to, so is defined to be<br /> None:<br /> print()<br /> # None. strings and stripped_strings¶<br /> If there’s more than one thing inside a tag, you can still look at<br /> just the strings. Use the. strings generator:<br /> for string in rings:<br /> print(repr(string))<br /> ‘\n’<br /> # “The Dormouse’s story”<br /> # ‘\n’<br /> # ‘Once upon a time there were three little sisters; and their names were\n’<br /> # ‘Elsie’<br /> # ‘, \n’<br /> # ‘Lacie’<br /> # ‘ and\n’<br /> # ‘Tillie’<br /> # ‘;\nand they lived at the bottom of a well. ‘<br /> # ‘… ‘<br /> These strings tend to have a lot of extra whitespace, which you can<br /> remove by using the. stripped_strings generator instead:<br /> for string in ripped_strings:<br /> # ‘Once upon a time there were three little sisters; and their names were’<br /> # ‘, ‘<br /> # ‘and’<br /> # ‘;\n and they lived at the bottom of a well. ‘<br /> Here, strings consisting entirely of whitespace are ignored, and<br /> whitespace at the beginning and end of strings is removed.<br /> Going up¶<br /> Continuing the “family tree” analogy, every tag and every string has a<br /> parent: the tag that contains it.<br /> You can access an element’s parent with the attribute. In<br /> the example “three sisters” document, the <head> tag is the parent<br /> of the <title> tag:<br /> title_tag =<br /> The title string itself has a parent: the <title> tag that contains<br /> it:<br /> The parent of a top-level tag like <html> is the BeautifulSoup object<br /> itself:<br /> html_tag =<br /> # <class 'autifulSoup'><br /> And the of a BeautifulSoup object is defined as None:<br /> # None. parents¶<br /> You can iterate over all of an element’s parents with. parents. This example uses. parents to travel from an <a> tag<br /> buried deep within the document, to the very top of the document:<br /> link = soup. a<br /> link<br /> for parent in rents:<br /> # p<br /> # body<br /> # html<br /> # [document]<br /> Going sideways¶<br /> Consider a simple document like this:<br /> sibling_soup = BeautifulSoup(“<a><b>text1</b><c>text2</c></b></a>“, ”)<br /> # <a><br /> # text1<br /> # <c><br /> # text2<br /> # </c><br /> The <b> tag and the <c> tag are at the same level: they’re both direct<br /> children of the same tag. We call them siblings. When a document is<br /> pretty-printed, siblings show up at the same indentation level. You<br /> can also use this relationship in the code you write.. next_sibling and. 
previous_sibling¶<br /> You can use. previous_sibling to navigate<br /> between page elements that are on the same level of the parse tree:<br /> xt_sibling<br /> # <c>text2</c><br /> evious_sibling<br /> # <b>text1</b><br /> The <b> tag has a. next_sibling, but no. previous_sibling,<br /> because there’s nothing before the <b> tag on the same level of the<br /> tree. For the same reason, the <c> tag has a. previous_sibling<br /> but no. next_sibling:<br /> print(evious_sibling)<br /> print(xt_sibling)<br /> The strings “text1” and “text2” are not siblings, because they don’t<br /> have the same parent:<br /> # ‘text1’<br /> In real documents, the. next_sibling or. previous_sibling of a<br /> tag will usually be a string containing whitespace. Going back to the<br /> “three sisters” document:<br /> # <a href=" class="sister" id="link1">Elsie</a><br /> # <a href=" class="sister" id="link2">Lacie</a><br /> # <a href=" class="sister" id="link3">Tillie</a><br /> You might think that the. next_sibling of the first <a> tag would<br /> be the second <a> tag. But actually, it’s a string: the comma and<br /> newline that separate the first <a> tag from the second:<br /> # ‘, \n ‘<br /> The second <a> tag is actually the. next_sibling of the comma:<br /> # <a class="sister" href=" id="link2">Lacie</a>. next_siblings and. previous_siblings¶<br /> You can iterate over a tag’s siblings with. next_siblings or. previous_siblings:<br /> for sibling in xt_siblings:<br /> print(repr(sibling))<br /> # <a class="sister" href=" id="link2">Lacie</a><br /> # ‘; and they lived at the bottom of a well. ‘<br /> for sibling in (id=”link3″). previous_siblings:<br /> Going back and forth¶<br /> Take a look at the beginning of the “three sisters” document:<br /> # <html><head><title>The Dormouse’s story
An HTML parser takes this string of characters and turns it into a
series of events: “open an tag”, “open a tag”, “open a
tag”, “add a string”, “close the <title> tag”, “open a </p> <p> tag”, and so on. Beautiful Soup offers tools for reconstructing the<br /> initial parse of the document.. next_element and. previous_element¶<br /> The. next_element attribute of a string or tag points to whatever<br /> was parsed immediately afterwards. It might be the same as. next_sibling, but it’s usually drastically different.<br /> Here’s the final <a> tag in the “three sisters” document. Its. next_sibling is a string: the conclusion of the sentence that was<br /> interrupted by the start of the <a> tag. :<br /> last_a_tag = (“a”, id=”link3″)<br /> last_a_tag<br /> But the. next_element of that <a> tag, the thing that was parsed<br /> immediately after the <a> tag, is not the rest of that sentence:<br /> it’s the word “Tillie”:<br /> xt_element<br /> That’s because in the original markup, the word “Tillie” appeared<br /> before that semicolon. The parser encountered an <a> tag, then the<br /> word “Tillie”, then the closing </a> tag, then the semicolon and rest of<br /> the sentence. The semicolon is on the same level as the <a> tag, but the<br /> word “Tillie” was encountered first.<br /> The. previous_element attribute is the exact opposite of. next_element. It points to whatever element was parsed<br /> immediately before this one:<br /> evious_element<br /> # <a class="sister" href=" id="link3">Tillie</a>. next_elements and. previous_elements¶<br /> You should get the idea by now. You can use these iterators to move<br /> forward or backward in the document as it was parsed:<br /> for element in xt_elements:<br /> print(repr(element))<br /> # </p> <p class="story">… </p> <p>Beautiful Soup defines a lot of methods for searching the parse tree,<br /> but they’re all very similar. I’m going to spend a lot of time explaining<br /> the two most popular methods: find() and find_all(). The other<br /> methods take almost exactly the same arguments, so I’ll just cover<br /> them briefly.<br /> Once again, I’ll be using the “three sisters” document as an example:<br /> By passing in a filter to an argument like find_all(), you can<br /> zoom in on the parts of the document you’re interested in.<br /> Kinds of filters¶<br /> Before talking in detail about find_all() and similar methods, I<br /> want to show examples of different filters you can pass into these<br /> methods. These filters show up again and again, throughout the<br /> search API. You can use them to filter based on a tag’s name,<br /> on its attributes, on the text of a string, or on some combination of<br /> these.<br /> A string¶<br /> The simplest filter is a string. Pass a string to a search method and<br /> Beautiful Soup will perform a match against that exact string. This<br /> code finds all the <b> tags in the document:<br /> nd_all(‘b’)<br /> # [<b>The Dormouse’s story</b>]<br /> If you pass in a byte string, Beautiful Soup will assume the string is<br /> encoded as UTF-8. You can avoid this by passing in a Unicode string instead.<br /> A regular expression¶<br /> If you pass in a regular expression object, Beautiful Soup will filter<br /> against that regular expression using its search() method. 
This code<br /> finds all the tags whose names start with the letter “b”; in this<br /> case, the <body> tag and the <b> tag:<br /> import re<br /> for tag in nd_all(mpile(“^b”)):<br /> # b<br /> This code finds all the tags whose names contain the letter ‘t’:<br /> for tag in nd_all(mpile(“t”)):<br /> # title<br /> A list¶<br /> If you pass in a list, Beautiful Soup will allow a string match<br /> against any item in that list. This code finds all the <a> tags<br /> and all the <b> tags:<br /> nd_all([“a”, “b”])<br /> # [<b>The Dormouse’s story</b>,<br /> # <a class="sister" href=" id="link1">Elsie</a>,<br /> True¶<br /> The value True matches everything it can. This code finds all<br /> the tags in the document, but none of the text strings:<br /> for tag in nd_all(True):<br /> # head<br /> # a<br /> A function¶<br /> If none of the other matches work for you, define a function that<br /> takes an element as its only argument. The function should return<br /> True if the argument matches, and False otherwise.<br /> Here’s a function that returns True if a tag defines the “class”<br /> attribute but doesn’t define the “id” attribute:<br /> def has_class_but_no_id(tag):<br /> return tag. has_attr(‘class’) and not tag. has_attr(‘id’)<br /> Pass this function into find_all() and you’ll pick up all the </p> <p> tags:<br /> nd_all(has_class_but_no_id)<br /> # [</p> <p class="title"><b>The Dormouse’s story</b></p> <p>,<br /> # </p> <p class="story">Once upon a time there were…bottom of a well. </p> <p>,<br /> # </p> <p class="story">… </p> <p>]<br /> This function only picks up the </p> <p> tags. It doesn’t pick up the <a><br /> tags, because those tags define both “class” and “id”. It doesn’t pick<br /> up tags like <html> and <title>, because those tags don’t define<br /> “class”.<br /> If you pass in a function to filter on a specific attribute like<br /> href, the argument passed into the function will be the attribute<br /> value, not the whole tag. Here’s a function that finds all a tags<br /> whose href attribute does not match a regular expression:<br /> def not_lacie(href):<br /> return href and not mpile(“lacie”)(href)<br /> nd_all(href=not_lacie)<br /> The function can be as complicated as you need it to be. Here’s a<br /> function that returns True if a tag is surrounded by string<br /> objects:<br /> from bs4 import NavigableString<br /> def surrounded_by_strings(tag):<br /> return (isinstance(xt_element, NavigableString)<br /> and isinstance(evious_element, NavigableString))<br /> for tag in nd_all(surrounded_by_strings):<br /> Now we’re ready to look at the search methods in detail.<br /> find_all()¶<br /> Method signature: find_all(name, attrs, recursive, string, limit, **kwargs)<br /> The find_all() method looks through a tag’s descendants and<br /> retrieves all descendants that match your filters. I gave several<br /> examples in Kinds of filters, but here are a few more:<br /> nd_all(“title”)<br /> nd_all(“p”, “title”)<br /> # [</p> <p class="title"><b>The Dormouse’s story</b></p> <p>]<br /> nd_all(“a”)<br /> nd_all(id=”link2″)<br /> # [<a class="sister" href=" id="link2">Lacie</a>]<br /> (mpile(“sisters”))<br /> Some of these should look familiar, but others are new. What does it<br /> mean to pass in a value for string, or id? 
Why does<br /> find_all(“p”, “title”) find a </p> <p> tag with the CSS class “title”?<br /> Let’s look at the arguments to find_all().<br /> The name argument¶<br /> Pass in a value for name and you’ll tell Beautiful Soup to only<br /> consider tags with certain names. Text strings will be ignored, as<br /> will tags whose names that don’t match.<br /> This is the simplest usage:<br /> Recall from Kinds of filters that the value to name can be a<br /> string, a regular expression, a list, a function, or the value<br /> True.<br /> The keyword arguments¶<br /> Any argument that’s not recognized will be turned into a filter on one<br /> of a tag’s attributes. If you pass in a value for an argument called id,<br /> Beautiful Soup will filter against each tag’s ‘id’ attribute:<br /> nd_all(id=’link2′)<br /> If you pass in a value for href, Beautiful Soup will filter<br /> against each tag’s ‘href’ attribute:<br /> nd_all(mpile(“elsie”))<br /> # [<a class="sister" href=" id="link1">Elsie</a>]<br /> You can filter an attribute based on a string, a regular<br /> expression, a list, a function, or the value True.<br /> This code finds all tags whose id attribute has a value,<br /> regardless of what the value is:<br /> nd_all(id=True)<br /> You can filter multiple attributes at once by passing in more than one<br /> keyword argument:<br /> nd_all(mpile(“elsie”), id=’link1′)<br /> Some attributes, like the data-* attributes in HTML 5, have names that<br /> can’t be used as the names of keyword arguments:<br /> data_soup = BeautifulSoup(‘</p> <div data-foo="value">foo! </div> <p>‘, ”)<br /> nd_all(data-foo=”value”)<br /> # SyntaxError: keyword can’t be an expression<br /> You can use these attributes in searches by putting them into a<br /> dictionary and passing the dictionary into find_all() as the<br /> attrs argument:<br /> nd_all(attrs={“data-foo”: “value”})<br /> # [</p> <div data-foo="value">foo! </div> <p>]<br /> You can’t use a keyword argument to search for HTML’s ‘name’ element,<br /> because Beautiful Soup uses the name argument to contain the name<br /> of the tag itself. Instead, you can give a value to ‘name’ in the<br /> name_soup = BeautifulSoup(‘<input name="email"/>‘, ”)<br /> nd_all(name=”email”)<br /> # []<br /> nd_all(attrs={“name”: “email”})<br /> # [<input name="email"/>]<br /> Searching by CSS class¶<br /> It’s very useful to search for a tag that has a certain CSS class, but<br /> the name of the CSS attribute, “class”, is a reserved word in<br /> Python. Using class as a keyword argument will give you a syntax<br /> error. As of Beautiful Soup 4. 1. 2, you can search by CSS class using<br /> the keyword argument class_:<br /> nd_all(“a”, class_=”sister”)<br /> As with any keyword argument, you can pass class_ a string, a regular<br /> expression, a function, or True:<br /> nd_all(mpile(“itl”))<br /> def has_six_characters(css_class):<br /> return css_class is not None and len(css_class) == 6<br /> nd_all(class_=has_six_characters)<br /> Remember that a single tag can have multiple<br /> values for its “class” attribute. 
Remember that a single tag can have multiple values for its "class" attribute. When you search for a tag that matches a certain CSS class, you're matching against any of its CSS classes:

css_soup = BeautifulSoup('<p class="body strikeout"></p>', 'html.parser')
css_soup.find_all("p", class_="strikeout")
# [<p class="body strikeout"></p>]
css_soup.find_all("p", class_="body")

You can also search for the exact string value of the class attribute:

css_soup.find_all("p", class_="body strikeout")

But searching for variants of the string value won't work:

css_soup.find_all("p", class_="strikeout body")

If you want to search for tags that match two or more CSS classes, you should use a CSS selector:

css_soup.select("p.strikeout.body")

In older versions of Beautiful Soup, which don't have the class_ shortcut, you can use the attrs trick mentioned above. Create a dictionary whose value for "class" is the string (or regular expression, or whatever) you want to search for:

soup.find_all("a", attrs={"class": "sister"})

The string argument

With string you can search for strings instead of tags. As with name and the keyword arguments, you can pass in a string, a regular expression, a list, a function, or the value True. Here are some examples:

soup.find_all(string="Elsie")
# ['Elsie']
soup.find_all(string=["Tillie", "Elsie", "Lacie"])
# ['Elsie', 'Lacie', 'Tillie']
soup.find_all(string=re.compile("Dormouse"))
# ["The Dormouse's story", "The Dormouse's story"]

def is_the_only_string_within_a_tag(s):
    """Return True if this string is the only child of its parent tag."""
    return (s == s.parent.string)

soup.find_all(string=is_the_only_string_within_a_tag)
# ["The Dormouse's story", "The Dormouse's story", 'Elsie', 'Lacie', 'Tillie', '…']

Although string is for finding strings, you can combine it with arguments that find tags: Beautiful Soup will find all tags whose .string matches your value for string. This code finds the <a> tags whose .string is "Elsie":

soup.find_all("a", string="Elsie")
# [<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>]

The string argument is new in Beautiful Soup 4.4.0. In earlier versions it was called text:

soup.find_all("a", text="Elsie")

The limit argument

find_all() returns all the tags and strings that match your filters. This can take a while if the document is large. If you don't need all the results, you can pass in a number for limit. This works just like the LIMIT keyword in SQL. It tells Beautiful Soup to stop gathering results after it's found a certain number.

There are three links in the "three sisters" document, but this code only finds the first two:

soup.find_all("a", limit=2)
# [<a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>,
#  <a class="sister" href="http://example.com/lacie" id="link2">Lacie</a>]

The recursive argument

If you call mytag.find_all(), Beautiful Soup will examine all the descendants of mytag: its children, its children's children, and so on. If you only want Beautiful Soup to consider direct children, you can pass in recursive=False.
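Since limit and recursive change what find_all() returns rather than how a filter matches, a short self-contained sketch may make the difference concrete; the miniature document and variable names below are stand-ins I've added for illustration:

from bs4 import BeautifulSoup
import re

# Minimal stand-in for the three-sisters document, just to show limit/recursive/string.
soup = BeautifulSoup(
    '<html><head><title>The Dormouse\'s story</title></head><body>'
    '<p class="story">three little sisters'
    '<a id="link1">Elsie</a><a id="link2">Lacie</a><a id="link3">Tillie</a></p>'
    '</body></html>', 'html.parser')

print(len(soup.find_all("a", limit=2)))                    # 2, even though there are 3 links
print(len(soup.html.find_all("title")))                    # 1: <title> is a descendant of <html>
print(len(soup.html.find_all("title", recursive=False)))   # 0: it is not a direct child of <html>
print(soup.find_all(string=re.compile("sisters")))         # ['three little sisters']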
Frequently Asked Questions about python beautifulsoup examples

How do I use BeautifulSoup in Python?

First, we need to import all the libraries that we are going to use. Next, declare a variable for the url of the page. Then, make use of the Python urllib2 to get the HTML page of the url declared. Finally, parse the page into BeautifulSoup format so we can use BeautifulSoup to work on it.

What is BeautifulSoup in Python?

Beautiful Soup is a Python package for parsing HTML and XML documents (including those with malformed markup, i.e. non-closed tags, hence the name, after "tag soup"). It creates a parse tree for parsed pages that can be used to extract data from HTML, which is useful for web scraping.

How do you scrape data using BeautifulSoup?

To scrape a website using Python, you need to perform these four basic steps: sending an HTTP GET request to the URL of the webpage that you want to scrape, which will respond with HTML content; … fetching and parsing the data using Beautiful Soup and maintaining it in some data structure such as a dict or list; …
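The first and third answers describe the same fetch, parse, and extract workflow. Note that urllib2 only exists on Python 2; on Python 3, which this tutorial uses, the equivalents are urllib.request or the third-party requests library. Here is a minimal sketch of those steps, assuming requests and beautifulsoup4 are installed and using a placeholder URL rather than the tutorial's weather page:

import requests                      # pip install requests beautifulsoup4
from bs4 import BeautifulSoup

url = "https://example.com"          # placeholder URL for illustration

# 1. Send an HTTP GET request; the server responds with the page's HTML.
response = requests.get(url)
response.raise_for_status()

# 2. Parse the HTML into a BeautifulSoup tree.
soup = BeautifulSoup(response.text, "html.parser")

# 3. Identify the elements you care about (here: every link's text and href).
links = soup.find_all("a", href=True)

# 4. Keep the extracted data in a plain Python data structure.
data = [{"text": a.get_text(strip=True), "href": a["href"]} for a in links]
print(data)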
href="/python-beautifulsoup-examples/#respond" style="display:none;">Cancel reply</a></small></h3><form action="https://bilderupload.net/wp-comments-post.php" method="post" id="commentform" class="comment-form" novalidate><p class="comment-notes"><span id="email-notes">Your email address will not be published.</span> <span class="required-field-message">Required fields are marked <span class="required">*</span></span></p><p class="comment-form-comment"><label for="comment">Comment <span class="required">*</span></label> <textarea id="comment" name="comment" cols="45" rows="8" maxlength="65525" required></textarea></p><p class="comment-form-author"><label for="author">Name <span class="required">*</span></label> <input id="author" name="author" type="text" value="" size="30" maxlength="245" autocomplete="name" required /></p> <p class="comment-form-email"><label for="email">Email <span class="required">*</span></label> <input id="email" name="email" type="email" value="" size="30" maxlength="100" aria-describedby="email-notes" autocomplete="email" required /></p> <p class="comment-form-url"><label for="url">Website</label> <input id="url" name="url" type="url" value="" size="30" maxlength="200" autocomplete="url" /></p> <p class="comment-form-cookies-consent"><input id="wp-comment-cookies-consent" name="wp-comment-cookies-consent" type="checkbox" value="yes" /> <label for="wp-comment-cookies-consent">Save my name, email, and website in this browser for the next time I comment.</label></p> <p class="form-submit"><input name="submit" type="submit" id="submit" class="submit" value="Post Comment" /> <input type='hidden' name='comment_post_ID' value='12976' id='comment_post_ID' /> <input type='hidden' name='comment_parent' id='comment_parent' value='0' /> </p><p style="display: none;"><input type="hidden" id="akismet_comment_nonce" name="akismet_comment_nonce" value="b4da163bdc" /></p><p style="display: none !important;" class="akismet-fields-container" data-prefix="ak_"><label>Δ<textarea name="ak_hp_textarea" cols="45" rows="8" maxlength="100"></textarea></label><input type="hidden" id="ak_js_1" name="ak_js" value="28"/></p></form> </div><!-- #respond --> </div><!-- #comments --> </div><!-- .single-post-wrap --> </main><!-- #main --> </div><!-- #primary --> <aside id="secondary" class="widget-area"> <section id="recent-posts-2" class="widget widget_recent_entries"> <h2 class="widget-title">Recent Posts</h2> <ul> <li> <a href="https://bilderupload.net/simple-web-crawler/">Simple Web Crawler</a> </li> <li> <a href="https://bilderupload.net/web-crawler-app/">Web Crawler App</a> </li> <li> <a href="https://bilderupload.net/pirate-by-proxy/">Pirate By Proxy</a> </li> <li> <a href="https://bilderupload.net/bayproxy-eu/">Bayproxy Eu</a> </li> <li> <a href="https://bilderupload.net/how-to-setup-socks5-proxy-firefox/">How To Setup Socks5 Proxy Firefox</a> </li> <li> <a href="https://bilderupload.net/nsi-proxy-service-driver/">Nsi Proxy Service Driver</a> </li> <li> <a href="https://bilderupload.net/how-to-unblock-a-contact-on-skype/">How To Unblock A Contact On Skype</a> </li> <li> <a href="https://bilderupload.net/us-ip-address-list/">Us Ip Address List</a> </li> <li> <a href="https://bilderupload.net/my-ip-shopify/">My Ip Shopify</a> </li> <li> <a href="https://bilderupload.net/how-to-use-vpn-for-cheaper-flights/">How To Use Vpn For Cheaper Flights</a> </li> <li> <a href="https://bilderupload.net/how-to-make-bit-torrent-faster/">How To Make Bit Torrent Faster</a> </li> <li> <a 
href="https://bilderupload.net/vpn-vs/">Vpn Vs</a> </li> <li> <a href="https://bilderupload.net/craigslist-menu/">Craigslist Menu</a> </li> <li> <a href="https://bilderupload.net/how-to-set-up-multiple-instagram-accounts/">How To Set Up Multiple Instagram Accounts</a> </li> <li> <a href="https://bilderupload.net/facebook-unblocked-at-school-site/">Facebook Unblocked At School Site</a> </li> <li> <a href="https://bilderupload.net/youtube-at-school-proxy/">Youtube At School Proxy</a> </li> <li> <a href="https://bilderupload.net/what-does-availability-mean-in-utorrent/">What Does Availability Mean In Utorrent</a> </li> <li> <a href="https://bilderupload.net/requests-python-example/">Requests Python Example</a> </li> <li> <a href="https://bilderupload.net/can-a-vpn-get-around-net-neutrality/">Can A Vpn Get Around Net Neutrality</a> </li> <li> <a href="https://bilderupload.net/xml-parse-python/">Xml Parse Python</a> </li> </ul> </section><section id="tag_cloud-2" class="widget widget_tag_cloud"><h2 class="widget-title">Tags</h2><div class="tagcloud"><a href="https://bilderupload.net/tag/1-million-proxy-list/" class="tag-cloud-link tag-link-945 tag-link-position-1" style="font-size: 8pt;" aria-label="1 million proxy list (27 items)">1 million proxy list</a> <a href="https://bilderupload.net/tag/best-free-proxy/" class="tag-cloud-link tag-link-580 tag-link-position-2" style="font-size: 19.666666666667pt;" aria-label="best free proxy (99 items)">best free proxy</a> <a href="https://bilderupload.net/tag/best-free-proxy-server-list/" class="tag-cloud-link tag-link-947 tag-link-position-3" style="font-size: 16.909090909091pt;" aria-label="best free proxy server list (73 items)">best free proxy server list</a> <a href="https://bilderupload.net/tag/best-proxy-server/" class="tag-cloud-link tag-link-579 tag-link-position-4" style="font-size: 11.181818181818pt;" aria-label="best proxy server (39 items)">best proxy server</a> <a href="https://bilderupload.net/tag/craigslist-account-for-sale/" class="tag-cloud-link tag-link-1376 tag-link-position-5" style="font-size: 8pt;" aria-label="craigslist account for sale (27 items)">craigslist account for sale</a> <a href="https://bilderupload.net/tag/craigslist-homepage/" class="tag-cloud-link tag-link-1375 tag-link-position-6" style="font-size: 10.969696969697pt;" aria-label="craigslist homepage (38 items)">craigslist homepage</a> <a href="https://bilderupload.net/tag/fastest-proxy-list/" class="tag-cloud-link tag-link-1503 tag-link-position-7" style="font-size: 8pt;" aria-label="fastest proxy list (27 items)">fastest proxy list</a> <a href="https://bilderupload.net/tag/free-proxy/" class="tag-cloud-link tag-link-435 tag-link-position-8" style="font-size: 12.030303030303pt;" aria-label="free proxy (43 items)">free proxy</a> <a href="https://bilderupload.net/tag/free-proxy-list/" class="tag-cloud-link tag-link-219 tag-link-position-9" style="font-size: 21.151515151515pt;" aria-label="free proxy list (116 items)">free proxy list</a> <a href="https://bilderupload.net/tag/free-proxy-list-download/" class="tag-cloud-link tag-link-1505 tag-link-position-10" style="font-size: 10.545454545455pt;" aria-label="free proxy list download (36 items)">free proxy list download</a> <a href="https://bilderupload.net/tag/free-proxy-list-india/" class="tag-cloud-link tag-link-3581 tag-link-position-11" style="font-size: 8.2121212121212pt;" aria-label="free proxy list india (28 items)">free proxy list india</a> <a href="https://bilderupload.net/tag/free-proxy-list-txt/" 
class="tag-cloud-link tag-link-1504 tag-link-position-12" style="font-size: 11.818181818182pt;" aria-label="free proxy list txt (42 items)">free proxy list txt</a> <a href="https://bilderupload.net/tag/free-proxy-list-usa/" class="tag-cloud-link tag-link-2648 tag-link-position-13" style="font-size: 8.2121212121212pt;" aria-label="free proxy list usa (28 items)">free proxy list usa</a> <a href="https://bilderupload.net/tag/free-proxy-server/" class="tag-cloud-link tag-link-136 tag-link-position-14" style="font-size: 13.090909090909pt;" aria-label="free proxy server (48 items)">free proxy server</a> <a href="https://bilderupload.net/tag/free-proxy-server-list/" class="tag-cloud-link tag-link-449 tag-link-position-15" style="font-size: 17.545454545455pt;" aria-label="free proxy server list (79 items)">free proxy server list</a> <a href="https://bilderupload.net/tag/free-socks-list-daily/" class="tag-cloud-link tag-link-949 tag-link-position-16" style="font-size: 11.818181818182pt;" aria-label="free socks list daily (42 items)">free socks list daily</a> <a href="https://bilderupload.net/tag/free-vpn-for-netflix/" class="tag-cloud-link tag-link-384 tag-link-position-17" style="font-size: 10.333333333333pt;" aria-label="free vpn for netflix (35 items)">free vpn for netflix</a> <a href="https://bilderupload.net/tag/free-vpn-to-hide-ip-address/" class="tag-cloud-link tag-link-14 tag-link-position-18" style="font-size: 15.212121212121pt;" aria-label="free vpn to hide ip address (60 items)">free vpn to hide ip address</a> <a href="https://bilderupload.net/tag/free-web-proxy/" class="tag-cloud-link tag-link-885 tag-link-position-19" style="font-size: 8.8484848484848pt;" aria-label="free web proxy (30 items)">free web proxy</a> <a href="https://bilderupload.net/tag/fresh-unblocked-proxy-sites-2020/" class="tag-cloud-link tag-link-1618 tag-link-position-20" style="font-size: 9.4848484848485pt;" aria-label="fresh unblocked proxy sites 2020 (32 items)">fresh unblocked proxy sites 2020</a> <a href="https://bilderupload.net/tag/hide-my-ip-address-free/" class="tag-cloud-link tag-link-11 tag-link-position-21" style="font-size: 13.727272727273pt;" aria-label="hide my ip address free (51 items)">hide my ip address free</a> <a href="https://bilderupload.net/tag/hide-my-ip-address-free-online/" class="tag-cloud-link tag-link-13 tag-link-position-22" style="font-size: 15.212121212121pt;" aria-label="hide my ip address free online (60 items)">hide my ip address free online</a> <a href="https://bilderupload.net/tag/hide-my-ip-online/" class="tag-cloud-link tag-link-1569 tag-link-position-23" style="font-size: 10.545454545455pt;" aria-label="hide my ip online (36 items)">hide my ip online</a> <a href="https://bilderupload.net/tag/how-to-hide-ip-address-on-android/" class="tag-cloud-link tag-link-1568 tag-link-position-24" style="font-size: 9.9090909090909pt;" aria-label="how to hide ip address on android (34 items)">how to hide ip address on android</a> <a href="https://bilderupload.net/tag/how-to-hide-my-ip-address-in-gmail/" class="tag-cloud-link tag-link-15 tag-link-position-25" style="font-size: 10.969696969697pt;" aria-label="how to hide my ip address in gmail (38 items)">how to hide my ip address in gmail</a> <a href="https://bilderupload.net/tag/how-to-hide-my-ip-address-without-vpn/" class="tag-cloud-link tag-link-12 tag-link-position-26" style="font-size: 14.787878787879pt;" aria-label="how to hide my ip address without vpn (58 items)">how to hide my ip address without vpn</a> <a 
href="https://bilderupload.net/tag/ip-address-tracker/" class="tag-cloud-link tag-link-2387 tag-link-position-27" style="font-size: 12.454545454545pt;" aria-label="ip address tracker (45 items)">ip address tracker</a> <a href="https://bilderupload.net/tag/my-ip-country/" class="tag-cloud-link tag-link-3401 tag-link-position-28" style="font-size: 10.757575757576pt;" aria-label="my ip country (37 items)">my ip country</a> <a href="https://bilderupload.net/tag/proxy-browser/" class="tag-cloud-link tag-link-577 tag-link-position-29" style="font-size: 11.606060606061pt;" aria-label="proxy browser (41 items)">proxy browser</a> <a href="https://bilderupload.net/tag/proxy-free/" class="tag-cloud-link tag-link-506 tag-link-position-30" style="font-size: 9.4848484848485pt;" aria-label="proxy free (32 items)">proxy free</a> <a href="https://bilderupload.net/tag/proxy-server/" class="tag-cloud-link tag-link-439 tag-link-position-31" style="font-size: 16.69696969697pt;" aria-label="proxy server (72 items)">proxy server</a> <a href="https://bilderupload.net/tag/proxy-server-address/" class="tag-cloud-link tag-link-581 tag-link-position-32" style="font-size: 13.939393939394pt;" aria-label="proxy server address (53 items)">proxy server address</a> <a href="https://bilderupload.net/tag/proxy-site/" class="tag-cloud-link tag-link-504 tag-link-position-33" style="font-size: 13.30303030303pt;" aria-label="proxy site (49 items)">proxy site</a> <a href="https://bilderupload.net/tag/proxy-url-list/" class="tag-cloud-link tag-link-222 tag-link-position-34" style="font-size: 9.2727272727273pt;" aria-label="proxy url list (31 items)">proxy url list</a> <a href="https://bilderupload.net/tag/proxy-websites/" class="tag-cloud-link tag-link-857 tag-link-position-35" style="font-size: 15.636363636364pt;" aria-label="proxy websites (63 items)">proxy websites</a> <a href="https://bilderupload.net/tag/socks5-proxy-list/" class="tag-cloud-link tag-link-69 tag-link-position-36" style="font-size: 8.2121212121212pt;" aria-label="socks5 proxy list (28 items)">socks5 proxy list</a> <a href="https://bilderupload.net/tag/socks5-proxy-list-txt/" class="tag-cloud-link tag-link-445 tag-link-position-37" style="font-size: 11.181818181818pt;" aria-label="socks5 proxy list txt (39 items)">socks5 proxy list txt</a> <a href="https://bilderupload.net/tag/thepiratebay3-list/" class="tag-cloud-link tag-link-232 tag-link-position-38" style="font-size: 8.8484848484848pt;" aria-label="thepiratebay3 list (30 items)">thepiratebay3 list</a> <a href="https://bilderupload.net/tag/unblock-proxy-free/" class="tag-cloud-link tag-link-128 tag-link-position-39" style="font-size: 22pt;" aria-label="unblock proxy free (128 items)">unblock proxy free</a> <a href="https://bilderupload.net/tag/unblocktheship-proxy/" class="tag-cloud-link tag-link-235 tag-link-position-40" style="font-size: 8.2121212121212pt;" aria-label="unblocktheship proxy (28 items)">unblocktheship proxy</a> <a href="https://bilderupload.net/tag/utorrent-download/" class="tag-cloud-link tag-link-598 tag-link-position-41" style="font-size: 8.8484848484848pt;" aria-label="utorrent download (30 items)">utorrent download</a> <a href="https://bilderupload.net/tag/vpn-proxy/" class="tag-cloud-link tag-link-858 tag-link-position-42" style="font-size: 11.606060606061pt;" aria-label="vpn proxy (41 items)">vpn proxy</a> <a href="https://bilderupload.net/tag/what-is-a-proxy-server/" class="tag-cloud-link tag-link-3124 tag-link-position-43" style="font-size: 8.2121212121212pt;" aria-label="what is a 
proxy server (28 items)">what is a proxy server</a> <a href="https://bilderupload.net/tag/what-is-my-ip/" class="tag-cloud-link tag-link-18 tag-link-position-44" style="font-size: 8.6363636363636pt;" aria-label="what is my ip (29 items)">what is my ip</a> <a href="https://bilderupload.net/tag/what-is-my-private-ip/" class="tag-cloud-link tag-link-3403 tag-link-position-45" style="font-size: 9.6969696969697pt;" aria-label="what is my private ip (33 items)">what is my private ip</a></div> </section></aside><!-- #secondary --> </div><!-- .container --> </div><!-- #content --> <footer id="colophon" class="site-footer"> <div id="footer-widgets" class="container"> <aside class="widget-area" role="complementary" aria-label="Footer"> <div class="widget-column footer-widget-3"> <section id="custom_html-2" class="widget_text widget widget_custom_html"><div class="textwidget custom-html-widget"><!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-S1X842WQMK"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-S1X842WQMK'); </script></div></section> </div> </aside><!-- .widget-area --> </div><!-- .container --> <div class="site-info"> <div class="container"> Theme Blog Tales by <a target="_blank" rel="designer" href="https://kantipurthemes.com/">Kantipur Themes</a> </div><!-- .container --> </div><!-- .site-info --> </footer><!-- #colophon --> <a href="#page" class="to-top"></a> </div><!-- #page --> <!--noptimize--><script>!function(){window.advanced_ads_ready_queue=window.advanced_ads_ready_queue||[],advanced_ads_ready_queue.push=window.advanced_ads_ready;for(var d=0,a=advanced_ads_ready_queue.length;d<a;d++)advanced_ads_ready(advanced_ads_ready_queue[d])}();</script><!--/noptimize--><svg style="position: absolute; width: 0; height: 0; overflow: hidden;" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"> <defs> <symbol id="icon-behance" viewBox="0 0 37 32"> <path class="path1" d="M33 6.054h-9.125v2.214h9.125v-2.214zM28.5 13.661q-1.607 0-2.607 0.938t-1.107 2.545h7.286q-0.321-3.482-3.571-3.482zM28.786 24.107q1.125 0 2.179-0.571t1.357-1.554h3.946q-1.786 5.482-7.625 5.482-3.821 0-6.080-2.357t-2.259-6.196q0-3.714 2.33-6.17t6.009-2.455q2.464 0 4.295 1.214t2.732 3.196 0.902 4.429q0 0.304-0.036 0.839h-11.75q0 1.982 1.027 3.063t2.973 1.080zM4.946 23.214h5.286q3.661 0 3.661-2.982 0-3.214-3.554-3.214h-5.393v6.196zM4.946 13.625h5.018q1.393 0 2.205-0.652t0.813-2.027q0-2.571-3.393-2.571h-4.643v5.25zM0 4.536h10.607q1.554 0 2.768 0.25t2.259 0.848 1.607 1.723 0.563 2.75q0 3.232-3.071 4.696 2.036 0.571 3.071 2.054t1.036 3.643q0 1.339-0.438 2.438t-1.179 1.848-1.759 1.268-2.161 0.75-2.393 0.232h-10.911v-22.5z"></path> </symbol> <symbol id="icon-deviantart" viewBox="0 0 18 32"> <path class="path1" d="M18.286 5.411l-5.411 10.393 0.429 0.554h4.982v7.411h-9.054l-0.786 0.536-2.536 4.875-0.536 0.536h-5.375v-5.411l5.411-10.411-0.429-0.536h-4.982v-7.411h9.054l0.786-0.536 2.536-4.875 0.536-0.536h5.375v5.411z"></path> </symbol> <symbol id="icon-medium" viewBox="0 0 32 32"> <path class="path1" d="M10.661 7.518v20.946q0 0.446-0.223 0.759t-0.652 0.313q-0.304 0-0.589-0.143l-8.304-4.161q-0.375-0.179-0.634-0.598t-0.259-0.83v-20.357q0-0.357 0.179-0.607t0.518-0.25q0.25 0 0.786 0.268l9.125 4.571q0.054 0.054 0.054 0.089zM11.804 9.321l9.536 15.464-9.536-4.75v-10.714zM32 9.643v18.821q0 0.446-0.25 0.723t-0.679 0.277-0.839-0.232l-7.875-3.929zM31.946 7.5q0 0.054-4.58 
7.491t-5.366 8.705l-6.964-11.321 5.786-9.411q0.304-0.5 0.929-0.5 0.25 0 0.464 0.107l9.661 4.821q0.071 0.036 0.071 0.107z"></path> </symbol> <symbol id="icon-slideshare" viewBox="0 0 32 32"> <path class="path1" d="M15.589 13.214q0 1.482-1.134 2.545t-2.723 1.063-2.723-1.063-1.134-2.545q0-1.5 1.134-2.554t2.723-1.054 2.723 1.054 1.134 2.554zM24.554 13.214q0 1.482-1.125 2.545t-2.732 1.063q-1.589 0-2.723-1.063t-1.134-2.545q0-1.5 1.134-2.554t2.723-1.054q1.607 0 2.732 1.054t1.125 2.554zM28.571 16.429v-11.911q0-1.554-0.571-2.205t-1.982-0.652h-19.857q-1.482 0-2.009 0.607t-0.527 2.25v12.018q0.768 0.411 1.58 0.714t1.446 0.5 1.446 0.33 1.268 0.196 1.25 0.071 1.045 0.009 1.009-0.036 0.795-0.036q1.214-0.018 1.696 0.482 0.107 0.107 0.179 0.161 0.464 0.446 1.089 0.911 0.125-1.625 2.107-1.554 0.089 0 0.652 0.027t0.768 0.036 0.813 0.018 0.946-0.018 0.973-0.080 1.089-0.152 1.107-0.241 1.196-0.348 1.205-0.482 1.286-0.616zM31.482 16.339q-2.161 2.661-6.643 4.5 1.5 5.089-0.411 8.304-1.179 2.018-3.268 2.643-1.857 0.571-3.25-0.268-1.536-0.911-1.464-2.929l-0.018-5.821v-0.018q-0.143-0.036-0.438-0.107t-0.42-0.089l-0.018 6.036q0.071 2.036-1.482 2.929-1.411 0.839-3.268 0.268-2.089-0.643-3.25-2.679-1.875-3.214-0.393-8.268-4.482-1.839-6.643-4.5-0.446-0.661-0.071-1.125t1.071 0.018q0.054 0.036 0.196 0.125t0.196 0.143v-12.393q0-1.286 0.839-2.196t2.036-0.911h22.446q1.196 0 2.036 0.911t0.839 2.196v12.393l0.375-0.268q0.696-0.482 1.071-0.018t-0.071 1.125z"></path> </symbol> <symbol id="icon-snapchat-ghost" viewBox="0 0 30 32"> <path class="path1" d="M15.143 2.286q2.393-0.018 4.295 1.223t2.92 3.438q0.482 1.036 0.482 3.196 0 0.839-0.161 3.411 0.25 0.125 0.5 0.125 0.321 0 0.911-0.241t0.911-0.241q0.518 0 1 0.321t0.482 0.821q0 0.571-0.563 0.964t-1.232 0.563-1.232 0.518-0.563 0.848q0 0.268 0.214 0.768 0.661 1.464 1.83 2.679t2.58 1.804q0.5 0.214 1.429 0.411 0.5 0.107 0.5 0.625 0 1.25-3.911 1.839-0.125 0.196-0.196 0.696t-0.25 0.83-0.589 0.33q-0.357 0-1.107-0.116t-1.143-0.116q-0.661 0-1.107 0.089-0.571 0.089-1.125 0.402t-1.036 0.679-1.036 0.723-1.357 0.598-1.768 0.241q-0.929 0-1.723-0.241t-1.339-0.598-1.027-0.723-1.036-0.679-1.107-0.402q-0.464-0.089-1.125-0.089-0.429 0-1.17 0.134t-1.045 0.134q-0.446 0-0.625-0.33t-0.25-0.848-0.196-0.714q-3.911-0.589-3.911-1.839 0-0.518 0.5-0.625 0.929-0.196 1.429-0.411 1.393-0.571 2.58-1.804t1.83-2.679q0.214-0.5 0.214-0.768 0-0.5-0.563-0.848t-1.241-0.527-1.241-0.563-0.563-0.938q0-0.482 0.464-0.813t0.982-0.33q0.268 0 0.857 0.232t0.946 0.232q0.321 0 0.571-0.125-0.161-2.536-0.161-3.393 0-2.179 0.482-3.214 1.143-2.446 3.071-3.536t4.714-1.125z"></path> </symbol> <symbol id="icon-yelp" viewBox="0 0 27 32"> <path class="path1" d="M13.804 23.554v2.268q-0.018 5.214-0.107 5.446-0.214 0.571-0.911 0.714-0.964 0.161-3.241-0.679t-2.902-1.589q-0.232-0.268-0.304-0.643-0.018-0.214 0.071-0.464 0.071-0.179 0.607-0.839t3.232-3.857q0.018 0 1.071-1.25 0.268-0.339 0.705-0.438t0.884 0.063q0.429 0.179 0.67 0.518t0.223 0.75zM11.143 19.071q-0.054 0.982-0.929 1.25l-2.143 0.696q-4.911 1.571-5.214 1.571-0.625-0.036-0.964-0.643-0.214-0.446-0.304-1.339-0.143-1.357 0.018-2.973t0.536-2.223 1-0.571q0.232 0 3.607 1.375 1.25 0.518 2.054 0.839l1.5 0.607q0.411 0.161 0.634 0.545t0.205 0.866zM25.893 24.375q-0.125 0.964-1.634 2.875t-2.42 2.268q-0.661 0.25-1.125-0.125-0.25-0.179-3.286-5.125l-0.839-1.375q-0.25-0.375-0.205-0.821t0.348-0.821q0.625-0.768 1.482-0.464 0.018 0.018 2.125 0.714 3.625 1.179 4.321 1.42t0.839 0.366q0.5 0.393 0.393 1.089zM13.893 13.089q0.089 1.821-0.964 2.179-1.036 0.304-2.036-1.268l-6.75-10.679q-0.143-0.625 0.339-1.107 
0.732-0.768 3.705-1.598t4.009-0.563q0.714 0.179 0.875 0.804 0.054 0.321 0.393 5.455t0.429 6.777zM25.714 15.018q0.054 0.696-0.464 1.054-0.268 0.179-5.875 1.536-1.196 0.268-1.625 0.411l0.018-0.036q-0.411 0.107-0.821-0.071t-0.661-0.571q-0.536-0.839 0-1.554 0.018-0.018 1.339-1.821 2.232-3.054 2.679-3.643t0.607-0.696q0.5-0.339 1.161-0.036 0.857 0.411 2.196 2.384t1.446 2.991v0.054z"></path> </symbol> <symbol id="icon-vine" viewBox="0 0 27 32"> <path class="path1" d="M26.732 14.768v3.536q-1.804 0.411-3.536 0.411-1.161 2.429-2.955 4.839t-3.241 3.848-2.286 1.902q-1.429 0.804-2.893-0.054-0.5-0.304-1.080-0.777t-1.518-1.491-1.83-2.295-1.92-3.286-1.884-4.357-1.634-5.616-1.259-6.964h5.054q0.464 3.893 1.25 7.116t1.866 5.661 2.17 4.205 2.5 3.482q3.018-3.018 5.125-7.25-2.536-1.286-3.982-3.929t-1.446-5.946q0-3.429 1.857-5.616t5.071-2.188q3.179 0 4.875 1.884t1.696 5.313q0 2.839-1.036 5.107-0.125 0.018-0.348 0.054t-0.821 0.036-1.125-0.107-1.107-0.455-0.902-0.92q0.554-1.839 0.554-3.286 0-1.554-0.518-2.357t-1.411-0.804q-0.946 0-1.518 0.884t-0.571 2.509q0 3.321 1.875 5.241t4.768 1.92q1.107 0 2.161-0.25z"></path> </symbol> <symbol id="icon-vk" viewBox="0 0 35 32"> <path class="path1" d="M34.232 9.286q0.411 1.143-2.679 5.25-0.429 0.571-1.161 1.518-1.393 1.786-1.607 2.339-0.304 0.732 0.25 1.446 0.304 0.375 1.446 1.464h0.018l0.071 0.071q2.518 2.339 3.411 3.946 0.054 0.089 0.116 0.223t0.125 0.473-0.009 0.607-0.446 0.491-1.054 0.223l-4.571 0.071q-0.429 0.089-1-0.089t-0.929-0.393l-0.357-0.214q-0.536-0.375-1.25-1.143t-1.223-1.384-1.089-1.036-1.009-0.277q-0.054 0.018-0.143 0.063t-0.304 0.259-0.384 0.527-0.304 0.929-0.116 1.384q0 0.268-0.063 0.491t-0.134 0.33l-0.071 0.089q-0.321 0.339-0.946 0.393h-2.054q-1.268 0.071-2.607-0.295t-2.348-0.946-1.839-1.179-1.259-1.027l-0.446-0.429q-0.179-0.179-0.491-0.536t-1.277-1.625-1.893-2.696-2.188-3.768-2.33-4.857q-0.107-0.286-0.107-0.482t0.054-0.286l0.071-0.107q0.268-0.339 1.018-0.339l4.893-0.036q0.214 0.036 0.411 0.116t0.286 0.152l0.089 0.054q0.286 0.196 0.429 0.571 0.357 0.893 0.821 1.848t0.732 1.455l0.286 0.518q0.518 1.071 1 1.857t0.866 1.223 0.741 0.688 0.607 0.25 0.482-0.089q0.036-0.018 0.089-0.089t0.214-0.393 0.241-0.839 0.17-1.446 0-2.232q-0.036-0.714-0.161-1.304t-0.25-0.821l-0.107-0.214q-0.446-0.607-1.518-0.768-0.232-0.036 0.089-0.429 0.304-0.339 0.679-0.536 0.946-0.464 4.268-0.429 1.464 0.018 2.411 0.232 0.357 0.089 0.598 0.241t0.366 0.429 0.188 0.571 0.063 0.813-0.018 0.982-0.045 1.259-0.027 1.473q0 0.196-0.018 0.75t-0.009 0.857 0.063 0.723 0.205 0.696 0.402 0.438q0.143 0.036 0.304 0.071t0.464-0.196 0.679-0.616 0.929-1.196 1.214-1.92q1.071-1.857 1.911-4.018 0.071-0.179 0.179-0.313t0.196-0.188l0.071-0.054 0.089-0.045t0.232-0.054 0.357-0.009l5.143-0.036q0.696-0.089 1.143 0.045t0.554 0.295z"></path> </symbol> <symbol id="icon-search" viewBox="0 0 30 32"> <path class="path1" d="M20.571 14.857q0-3.304-2.348-5.652t-5.652-2.348-5.652 2.348-2.348 5.652 2.348 5.652 5.652 2.348 5.652-2.348 2.348-5.652zM29.714 29.714q0 0.929-0.679 1.607t-1.607 0.679q-0.964 0-1.607-0.679l-6.125-6.107q-3.196 2.214-7.125 2.214-2.554 0-4.884-0.991t-4.018-2.679-2.679-4.018-0.991-4.884 0.991-4.884 2.679-4.018 4.018-2.679 4.884-0.991 4.884 0.991 4.018 2.679 2.679 4.018 0.991 4.884q0 3.929-2.214 7.125l6.125 6.125q0.661 0.661 0.661 1.607z"></path> </symbol> <symbol id="icon-envelope-o" viewBox="0 0 32 32"> <path class="path1" d="M29.714 26.857v-13.714q-0.571 0.643-1.232 1.179-4.786 3.679-7.607 6.036-0.911 0.768-1.482 1.196t-1.545 0.866-1.83 0.438h-0.036q-0.857 
0-1.83-0.438t-1.545-0.866-1.482-1.196q-2.821-2.357-7.607-6.036-0.661-0.536-1.232-1.179v13.714q0 0.232 0.17 0.402t0.402 0.17h26.286q0.232 0 0.402-0.17t0.17-0.402zM29.714 8.089v-0.438t-0.009-0.232-0.054-0.223-0.098-0.161-0.161-0.134-0.25-0.045h-26.286q-0.232 0-0.402 0.17t-0.17 0.402q0 3 2.625 5.071 3.446 2.714 7.161 5.661 0.107 0.089 0.625 0.527t0.821 0.67 0.795 0.563 0.902 0.491 0.768 0.161h0.036q0.357 0 0.768-0.161t0.902-0.491 0.795-0.563 0.821-0.67 0.625-0.527q3.714-2.946 7.161-5.661 0.964-0.768 1.795-2.063t0.83-2.348zM32 7.429v19.429q0 1.179-0.839 2.018t-2.018 0.839h-26.286q-1.179 0-2.018-0.839t-0.839-2.018v-19.429q0-1.179 0.839-2.018t2.018-0.839h26.286q1.179 0 2.018 0.839t0.839 2.018z"></path> </symbol> <symbol id="icon-close" viewBox="0 0 25 32"> <path class="path1" d="M23.179 23.607q0 0.714-0.5 1.214l-2.429 2.429q-0.5 0.5-1.214 0.5t-1.214-0.5l-5.25-5.25-5.25 5.25q-0.5 0.5-1.214 0.5t-1.214-0.5l-2.429-2.429q-0.5-0.5-0.5-1.214t0.5-1.214l5.25-5.25-5.25-5.25q-0.5-0.5-0.5-1.214t0.5-1.214l2.429-2.429q0.5-0.5 1.214-0.5t1.214 0.5l5.25 5.25 5.25-5.25q0.5-0.5 1.214-0.5t1.214 0.5l2.429 2.429q0.5 0.5 0.5 1.214t-0.5 1.214l-5.25 5.25 5.25 5.25q0.5 0.5 0.5 1.214z"></path> </symbol> <symbol id="icon-angle-down" viewBox="0 0 21 32"> <path class="path1" d="M19.196 13.143q0 0.232-0.179 0.411l-8.321 8.321q-0.179 0.179-0.411 0.179t-0.411-0.179l-8.321-8.321q-0.179-0.179-0.179-0.411t0.179-0.411l0.893-0.893q0.179-0.179 0.411-0.179t0.411 0.179l7.018 7.018 7.018-7.018q0.179-0.179 0.411-0.179t0.411 0.179l0.893 0.893q0.179 0.179 0.179 0.411z"></path> </symbol> <symbol id="icon-folder-open" viewBox="0 0 34 32"> <path class="path1" d="M33.554 17q0 0.554-0.554 1.179l-6 7.071q-0.768 0.911-2.152 1.545t-2.563 0.634h-19.429q-0.607 0-1.080-0.232t-0.473-0.768q0-0.554 0.554-1.179l6-7.071q0.768-0.911 2.152-1.545t2.563-0.634h19.429q0.607 0 1.080 0.232t0.473 0.768zM27.429 10.857v2.857h-14.857q-1.679 0-3.518 0.848t-2.929 2.134l-6.107 7.179q0-0.071-0.009-0.223t-0.009-0.223v-17.143q0-1.643 1.179-2.821t2.821-1.179h5.714q1.643 0 2.821 1.179t1.179 2.821v0.571h9.714q1.643 0 2.821 1.179t1.179 2.821z"></path> </symbol> <symbol id="icon-twitter" viewBox="0 0 30 32"> <path class="path1" d="M28.929 7.286q-1.196 1.75-2.893 2.982 0.018 0.25 0.018 0.75 0 2.321-0.679 4.634t-2.063 4.437-3.295 3.759-4.607 2.607-5.768 0.973q-4.839 0-8.857-2.589 0.625 0.071 1.393 0.071 4.018 0 7.161-2.464-1.875-0.036-3.357-1.152t-2.036-2.848q0.589 0.089 1.089 0.089 0.768 0 1.518-0.196-2-0.411-3.313-1.991t-1.313-3.67v-0.071q1.214 0.679 2.607 0.732-1.179-0.786-1.875-2.054t-0.696-2.75q0-1.571 0.786-2.911 2.161 2.661 5.259 4.259t6.634 1.777q-0.143-0.679-0.143-1.321 0-2.393 1.688-4.080t4.080-1.688q2.5 0 4.214 1.821 1.946-0.375 3.661-1.393-0.661 2.054-2.536 3.179 1.661-0.179 3.321-0.893z"></path> </symbol> <symbol id="icon-facebook" viewBox="0 0 19 32"> <path class="path1" d="M17.125 0.214v4.714h-2.804q-1.536 0-2.071 0.643t-0.536 1.929v3.375h5.232l-0.696 5.286h-4.536v13.554h-5.464v-13.554h-4.554v-5.286h4.554v-3.893q0-3.321 1.857-5.152t4.946-1.83q2.625 0 4.071 0.214z"></path> </symbol> <symbol id="icon-github" viewBox="0 0 27 32"> <path class="path1" d="M13.714 2.286q3.732 0 6.884 1.839t4.991 4.991 1.839 6.884q0 4.482-2.616 8.063t-6.759 4.955q-0.482 0.089-0.714-0.125t-0.232-0.536q0-0.054 0.009-1.366t0.009-2.402q0-1.732-0.929-2.536 1.018-0.107 1.83-0.321t1.679-0.696 1.446-1.188 0.946-1.875 0.366-2.688q0-2.125-1.411-3.679 0.661-1.625-0.143-3.643-0.5-0.161-1.446 0.196t-1.643 0.786l-0.679 0.429q-1.661-0.464-3.429-0.464t-3.429 
0.464q-0.286-0.196-0.759-0.482t-1.491-0.688-1.518-0.241q-0.804 2.018-0.143 3.643-1.411 1.554-1.411 3.679 0 1.518 0.366 2.679t0.938 1.875 1.438 1.196 1.679 0.696 1.83 0.321q-0.696 0.643-0.875 1.839-0.375 0.179-0.804 0.268t-1.018 0.089-1.17-0.384-0.991-1.116q-0.339-0.571-0.866-0.929t-0.884-0.429l-0.357-0.054q-0.375 0-0.518 0.080t-0.089 0.205 0.161 0.25 0.232 0.214l0.125 0.089q0.393 0.179 0.777 0.679t0.563 0.911l0.179 0.411q0.232 0.679 0.786 1.098t1.196 0.536 1.241 0.125 0.991-0.063l0.411-0.071q0 0.679 0.009 1.58t0.009 0.973q0 0.321-0.232 0.536t-0.714 0.125q-4.143-1.375-6.759-4.955t-2.616-8.063q0-3.732 1.839-6.884t4.991-4.991 6.884-1.839zM5.196 21.982q0.054-0.125-0.125-0.214-0.179-0.054-0.232 0.036-0.054 0.125 0.125 0.214 0.161 0.107 0.232-0.036zM5.75 22.589q0.125-0.089-0.036-0.286-0.179-0.161-0.286-0.054-0.125 0.089 0.036 0.286 0.179 0.179 0.286 0.054zM6.286 23.393q0.161-0.125 0-0.339-0.143-0.232-0.304-0.107-0.161 0.089 0 0.321t0.304 0.125zM7.036 24.143q0.143-0.143-0.071-0.339-0.214-0.214-0.357-0.054-0.161 0.143 0.071 0.339 0.214 0.214 0.357 0.054zM8.054 24.589q0.054-0.196-0.232-0.286-0.268-0.071-0.339 0.125t0.232 0.268q0.268 0.107 0.339-0.107zM9.179 24.679q0-0.232-0.304-0.196-0.286 0-0.286 0.196 0 0.232 0.304 0.196 0.286 0 0.286-0.196zM10.214 24.5q-0.036-0.196-0.321-0.161-0.286 0.054-0.25 0.268t0.321 0.143 0.25-0.25z"></path> </symbol> <symbol id="icon-bars" viewBox="0 0 27 32"> <path class="path1" d="M27.429 24v2.286q0 0.464-0.339 0.804t-0.804 0.339h-25.143q-0.464 0-0.804-0.339t-0.339-0.804v-2.286q0-0.464 0.339-0.804t0.804-0.339h25.143q0.464 0 0.804 0.339t0.339 0.804zM27.429 14.857v2.286q0 0.464-0.339 0.804t-0.804 0.339h-25.143q-0.464 0-0.804-0.339t-0.339-0.804v-2.286q0-0.464 0.339-0.804t0.804-0.339h25.143q0.464 0 0.804 0.339t0.339 0.804zM27.429 5.714v2.286q0 0.464-0.339 0.804t-0.804 0.339h-25.143q-0.464 0-0.804-0.339t-0.339-0.804v-2.286q0-0.464 0.339-0.804t0.804-0.339h25.143q0.464 0 0.804 0.339t0.339 0.804z"></path> </symbol> <symbol id="icon-google-plus" viewBox="0 0 41 32"> <path class="path1" d="M25.661 16.304q0 3.714-1.554 6.616t-4.429 4.536-6.589 1.634q-2.661 0-5.089-1.036t-4.179-2.786-2.786-4.179-1.036-5.089 1.036-5.089 2.786-4.179 4.179-2.786 5.089-1.036q5.107 0 8.768 3.429l-3.554 3.411q-2.089-2.018-5.214-2.018-2.196 0-4.063 1.107t-2.955 3.009-1.089 4.152 1.089 4.152 2.955 3.009 4.063 1.107q1.482 0 2.723-0.411t2.045-1.027 1.402-1.402 0.875-1.482 0.384-1.321h-7.429v-4.5h12.357q0.214 1.125 0.214 2.179zM41.143 14.125v3.75h-3.732v3.732h-3.75v-3.732h-3.732v-3.75h3.732v-3.732h3.75v3.732h3.732z"></path> </symbol> <symbol id="icon-linkedin" viewBox="0 0 27 32"> <path class="path1" d="M6.232 11.161v17.696h-5.893v-17.696h5.893zM6.607 5.696q0.018 1.304-0.902 2.179t-2.42 0.875h-0.036q-1.464 0-2.357-0.875t-0.893-2.179q0-1.321 0.92-2.188t2.402-0.866 2.375 0.866 0.911 2.188zM27.429 18.714v10.143h-5.875v-9.464q0-1.875-0.723-2.938t-2.259-1.063q-1.125 0-1.884 0.616t-1.134 1.527q-0.196 0.536-0.196 1.446v9.875h-5.875q0.036-7.125 0.036-11.554t-0.018-5.286l-0.018-0.857h5.875v2.571h-0.036q0.357-0.571 0.732-1t1.009-0.929 1.554-0.777 2.045-0.277q3.054 0 4.911 2.027t1.857 5.938z"></path> </symbol> <symbol id="icon-quote-right" viewBox="0 0 30 32"> <path class="path1" d="M13.714 5.714v12.571q0 1.857-0.723 3.545t-1.955 2.92-2.92 1.955-3.545 0.723h-1.143q-0.464 0-0.804-0.339t-0.339-0.804v-2.286q0-0.464 0.339-0.804t0.804-0.339h1.143q1.893 0 3.232-1.339t1.339-3.232v-0.571q0-0.714-0.5-1.214t-1.214-0.5h-4q-1.429 0-2.429-1t-1-2.429v-6.857q0-1.429 1-2.429t2.429-1h6.857q1.429 0 2.429 1t1 2.429zM29.714 5.714v12.571q0 
1.857-0.723 3.545t-1.955 2.92-2.92 1.955-3.545 0.723h-1.143q-0.464 0-0.804-0.339t-0.339-0.804v-2.286q0-0.464 0.339-0.804t0.804-0.339h1.143q1.893 0 3.232-1.339t1.339-3.232v-0.571q0-0.714-0.5-1.214t-1.214-0.5h-4q-1.429 0-2.429-1t-1-2.429v-6.857q0-1.429 1-2.429t2.429-1h6.857q1.429 0 2.429 1t1 2.429z"></path> </symbol> <symbol id="icon-mail-reply" viewBox="0 0 32 32"> <path class="path1" d="M32 20q0 2.964-2.268 8.054-0.054 0.125-0.188 0.429t-0.241 0.536-0.232 0.393q-0.214 0.304-0.5 0.304-0.268 0-0.42-0.179t-0.152-0.446q0-0.161 0.045-0.473t0.045-0.42q0.089-1.214 0.089-2.196 0-1.804-0.313-3.232t-0.866-2.473-1.429-1.804-1.884-1.241-2.375-0.759-2.75-0.384-3.134-0.107h-4v4.571q0 0.464-0.339 0.804t-0.804 0.339-0.804-0.339l-9.143-9.143q-0.339-0.339-0.339-0.804t0.339-0.804l9.143-9.143q0.339-0.339 0.804-0.339t0.804 0.339 0.339 0.804v4.571h4q12.732 0 15.625 7.196 0.946 2.393 0.946 5.946z"></path> </symbol> <symbol id="icon-youtube" viewBox="0 0 27 32"> <path class="path1" d="M17.339 22.214v3.768q0 1.196-0.696 1.196-0.411 0-0.804-0.393v-5.375q0.393-0.393 0.804-0.393 0.696 0 0.696 1.196zM23.375 22.232v0.821h-1.607v-0.821q0-1.214 0.804-1.214t0.804 1.214zM6.125 18.339h1.911v-1.679h-5.571v1.679h1.875v10.161h1.786v-10.161zM11.268 28.5h1.589v-8.821h-1.589v6.75q-0.536 0.75-1.018 0.75-0.321 0-0.375-0.375-0.018-0.054-0.018-0.625v-6.5h-1.589v6.982q0 0.875 0.143 1.304 0.214 0.661 1.036 0.661 0.857 0 1.821-1.089v0.964zM18.929 25.857v-3.518q0-1.304-0.161-1.768-0.304-1-1.268-1-0.893 0-1.661 0.964v-3.875h-1.589v11.839h1.589v-0.857q0.804 0.982 1.661 0.982 0.964 0 1.268-0.982 0.161-0.482 0.161-1.786zM24.964 25.679v-0.232h-1.625q0 0.911-0.036 1.089-0.125 0.643-0.714 0.643-0.821 0-0.821-1.232v-1.554h3.196v-1.839q0-1.411-0.482-2.071-0.696-0.911-1.893-0.911-1.214 0-1.911 0.911-0.5 0.661-0.5 2.071v3.089q0 1.411 0.518 2.071 0.696 0.911 1.929 0.911 1.286 0 1.929-0.946 0.321-0.482 0.375-0.964 0.036-0.161 0.036-1.036zM14.107 9.375v-3.75q0-1.232-0.768-1.232t-0.768 1.232v3.75q0 1.25 0.768 1.25t0.768-1.25zM26.946 22.786q0 4.179-0.464 6.25-0.25 1.054-1.036 1.768t-1.821 0.821q-3.286 0.375-9.911 0.375t-9.911-0.375q-1.036-0.107-1.83-0.821t-1.027-1.768q-0.464-2-0.464-6.25 0-4.179 0.464-6.25 0.25-1.054 1.036-1.768t1.839-0.839q3.268-0.357 9.893-0.357t9.911 0.357q1.036 0.125 1.83 0.839t1.027 1.768q0.464 2 0.464 6.25zM9.125 0h1.821l-2.161 7.125v4.839h-1.786v-4.839q-0.25-1.321-1.089-3.786-0.661-1.839-1.161-3.339h1.893l1.268 4.696zM15.732 5.946v3.125q0 1.446-0.5 2.107-0.661 0.911-1.893 0.911-1.196 0-1.875-0.911-0.5-0.679-0.5-2.107v-3.125q0-1.429 0.5-2.089 0.679-0.911 1.875-0.911 1.232 0 1.893 0.911 0.5 0.661 0.5 2.089zM21.714 3.054v8.911h-1.625v-0.982q-0.946 1.107-1.839 1.107-0.821 0-1.054-0.661-0.143-0.429-0.143-1.339v-7.036h1.625v6.554q0 0.589 0.018 0.625 0.054 0.393 0.375 0.393 0.482 0 1.018-0.768v-6.804h1.625z"></path> </symbol> <symbol id="icon-dropbox" viewBox="0 0 32 32"> <path class="path1" d="M7.179 12.625l8.821 5.446-6.107 5.089-8.75-5.696zM24.786 22.536v1.929l-8.75 5.232v0.018l-0.018-0.018-0.018 0.018v-0.018l-8.732-5.232v-1.929l2.625 1.714 6.107-5.071v-0.036l0.018 0.018 0.018-0.018v0.036l6.125 5.071zM9.893 2.107l6.107 5.089-8.821 5.429-6.036-4.821zM24.821 12.625l6.036 4.839-8.732 5.696-6.125-5.089zM22.125 2.107l8.732 5.696-6.036 4.821-8.821-5.429z"></path> </symbol> <symbol id="icon-instagram" viewBox="0 0 27 32"> <path class="path1" d="M18.286 16q0-1.893-1.339-3.232t-3.232-1.339-3.232 1.339-1.339 3.232 1.339 3.232 3.232 1.339 3.232-1.339 1.339-3.232zM20.75 16q0 2.929-2.054 4.982t-4.982 2.054-4.982-2.054-2.054-4.982 2.054-4.982 
4.982-2.054 4.982 2.054 2.054 4.982zM22.679 8.679q0 0.679-0.482 1.161t-1.161 0.482-1.161-0.482-0.482-1.161 0.482-1.161 1.161-0.482 1.161 0.482 0.482 1.161zM13.714 4.75q-0.125 0-1.366-0.009t-1.884 0-1.723 0.054-1.839 0.179-1.277 0.33q-0.893 0.357-1.571 1.036t-1.036 1.571q-0.196 0.518-0.33 1.277t-0.179 1.839-0.054 1.723 0 1.884 0.009 1.366-0.009 1.366 0 1.884 0.054 1.723 0.179 1.839 0.33 1.277q0.357 0.893 1.036 1.571t1.571 1.036q0.518 0.196 1.277 0.33t1.839 0.179 1.723 0.054 1.884 0 1.366-0.009 1.366 0.009 1.884 0 1.723-0.054 1.839-0.179 1.277-0.33q0.893-0.357 1.571-1.036t1.036-1.571q0.196-0.518 0.33-1.277t0.179-1.839 0.054-1.723 0-1.884-0.009-1.366 0.009-1.366 0-1.884-0.054-1.723-0.179-1.839-0.33-1.277q-0.357-0.893-1.036-1.571t-1.571-1.036q-0.518-0.196-1.277-0.33t-1.839-0.179-1.723-0.054-1.884 0-1.366 0.009zM27.429 16q0 4.089-0.089 5.661-0.179 3.714-2.214 5.75t-5.75 2.214q-1.571 0.089-5.661 0.089t-5.661-0.089q-3.714-0.179-5.75-2.214t-2.214-5.75q-0.089-1.571-0.089-5.661t0.089-5.661q0.179-3.714 2.214-5.75t5.75-2.214q1.571-0.089 5.661-0.089t5.661 0.089q3.714 0.179 5.75 2.214t2.214 5.75q0.089 1.571 0.089 5.661z"></path> </symbol> <symbol id="icon-flickr" viewBox="0 0 27 32"> <path class="path1" d="M22.286 2.286q2.125 0 3.634 1.509t1.509 3.634v17.143q0 2.125-1.509 3.634t-3.634 1.509h-17.143q-2.125 0-3.634-1.509t-1.509-3.634v-17.143q0-2.125 1.509-3.634t3.634-1.509h17.143zM12.464 16q0-1.571-1.107-2.679t-2.679-1.107-2.679 1.107-1.107 2.679 1.107 2.679 2.679 1.107 2.679-1.107 1.107-2.679zM22.536 16q0-1.571-1.107-2.679t-2.679-1.107-2.679 1.107-1.107 2.679 1.107 2.679 2.679 1.107 2.679-1.107 1.107-2.679z"></path> </symbol> <symbol id="icon-tumblr" viewBox="0 0 19 32"> <path class="path1" d="M16.857 23.732l1.429 4.232q-0.411 0.625-1.982 1.179t-3.161 0.571q-1.857 0.036-3.402-0.464t-2.545-1.321-1.696-1.893-0.991-2.143-0.295-2.107v-9.714h-3v-3.839q1.286-0.464 2.304-1.241t1.625-1.607 1.036-1.821 0.607-1.768 0.268-1.58q0.018-0.089 0.080-0.152t0.134-0.063h4.357v7.571h5.946v4.5h-5.964v9.25q0 0.536 0.116 1t0.402 0.938 0.884 0.741 1.455 0.25q1.393-0.036 2.393-0.518z"></path> </symbol> <symbol id="icon-dockerhub" viewBox="0 0 24 28"> <path class="path1" d="M1.597 10.257h2.911v2.83H1.597v-2.83zm3.573 0h2.91v2.83H5.17v-2.83zm0-3.627h2.91v2.829H5.17V6.63zm3.57 3.627h2.912v2.83H8.74v-2.83zm0-3.627h2.912v2.829H8.74V6.63zm3.573 3.627h2.911v2.83h-2.911v-2.83zm0-3.627h2.911v2.829h-2.911V6.63zm3.572 3.627h2.911v2.83h-2.911v-2.83zM12.313 3h2.911v2.83h-2.911V3zm-6.65 14.173c-.449 0-.812.354-.812.788 0 .435.364.788.812.788.447 0 .811-.353.811-.788 0-.434-.363-.788-.811-.788"></path> <path class="path2" d="M28.172 11.721c-.978-.549-2.278-.624-3.388-.306-.136-1.146-.91-2.149-1.83-2.869l-.366-.286-.307.345c-.618.692-.8 1.845-.718 2.73.063.651.273 1.312.685 1.834-.313.183-.668.328-.985.434-.646.212-1.347.33-2.028.33H.083l-.042.429c-.137 1.432.065 2.866.674 4.173l.262.519.03.048c1.8 2.973 4.963 4.225 8.41 4.225 6.672 0 12.174-2.896 14.702-9.015 1.689.085 3.417-.4 4.243-1.968l.211-.4-.401-.223zM5.664 19.458c-.85 0-1.542-.671-1.542-1.497 0-.825.691-1.498 1.541-1.498.849 0 1.54.672 1.54 1.497s-.69 1.498-1.539 1.498z"></path> </symbol> <symbol id="icon-dribbble" viewBox="0 0 27 32"> <path class="path1" d="M18.286 26.786q-0.75-4.304-2.5-8.893h-0.036l-0.036 0.018q-0.286 0.107-0.768 0.295t-1.804 0.875-2.446 1.464-2.339 2.045-1.839 2.643l-0.268-0.196q3.286 2.679 7.464 2.679 2.357 0 4.571-0.929zM14.982 15.946q-0.375-0.875-0.946-1.982-5.554 1.661-12.018 1.661-0.018 0.125-0.018 0.375 0 2.214 0.786 4.223t2.214 3.598q0.893-1.589 
2.205-2.973t2.545-2.223 2.33-1.446 1.777-0.857l0.661-0.232q0.071-0.018 0.232-0.063t0.232-0.080zM13.071 12.161q-2.143-3.804-4.357-6.75-2.464 1.161-4.179 3.321t-2.286 4.857q5.393 0 10.821-1.429zM25.286 17.857q-3.75-1.071-7.304-0.518 1.554 4.268 2.286 8.375 1.982-1.339 3.304-3.384t1.714-4.473zM10.911 4.625q-0.018 0-0.036 0.018 0.018-0.018 0.036-0.018zM21.446 7.214q-3.304-2.929-7.732-2.929-1.357 0-2.768 0.339 2.339 3.036 4.393 6.821 1.232-0.464 2.321-1.080t1.723-1.098 1.17-1.018 0.67-0.723zM25.429 15.875q-0.054-4.143-2.661-7.321l-0.018 0.018q-0.161 0.214-0.339 0.438t-0.777 0.795-1.268 1.080-1.786 1.161-2.348 1.152q0.446 0.946 0.786 1.696 0.036 0.107 0.116 0.313t0.134 0.295q0.643-0.089 1.33-0.125t1.313-0.036 1.232 0.027 1.143 0.071 1.009 0.098 0.857 0.116 0.652 0.107 0.446 0.080zM27.429 16q0 3.732-1.839 6.884t-4.991 4.991-6.884 1.839-6.884-1.839-4.991-4.991-1.839-6.884 1.839-6.884 4.991-4.991 6.884-1.839 6.884 1.839 4.991 4.991 1.839 6.884z"></path> </symbol> <symbol id="icon-skype" viewBox="0 0 27 32"> <path class="path1" d="M20.946 18.982q0-0.893-0.348-1.634t-0.866-1.223-1.304-0.875-1.473-0.607-1.563-0.411l-1.857-0.429q-0.536-0.125-0.786-0.188t-0.625-0.205-0.536-0.286-0.295-0.375-0.134-0.536q0-1.375 2.571-1.375 0.768 0 1.375 0.214t0.964 0.509 0.679 0.598 0.714 0.518 0.857 0.214q0.839 0 1.348-0.571t0.509-1.375q0-0.982-1-1.777t-2.536-1.205-3.25-0.411q-1.214 0-2.357 0.277t-2.134 0.839-1.589 1.554-0.598 2.295q0 1.089 0.339 1.902t1 1.348 1.429 0.866 1.839 0.58l2.607 0.643q1.607 0.393 2 0.643 0.571 0.357 0.571 1.071 0 0.696-0.714 1.152t-1.875 0.455q-0.911 0-1.634-0.286t-1.161-0.688-0.813-0.804-0.821-0.688-0.964-0.286q-0.893 0-1.348 0.536t-0.455 1.339q0 1.643 2.179 2.813t5.196 1.17q1.304 0 2.5-0.33t2.188-0.955 1.58-1.67 0.589-2.348zM27.429 22.857q0 2.839-2.009 4.848t-4.848 2.009q-2.321 0-4.179-1.429-1.375 0.286-2.679 0.286-2.554 0-4.884-0.991t-4.018-2.679-2.679-4.018-0.991-4.884q0-1.304 0.286-2.679-1.429-1.857-1.429-4.179 0-2.839 2.009-4.848t4.848-2.009q2.321 0 4.179 1.429 1.375-0.286 2.679-0.286 2.554 0 4.884 0.991t4.018 2.679 2.679 4.018 0.991 4.884q0 1.304-0.286 2.679 1.429 1.857 1.429 4.179z"></path> </symbol> <symbol id="icon-foursquare" viewBox="0 0 23 32"> <path class="path1" d="M17.857 7.75l0.661-3.464q0.089-0.411-0.161-0.714t-0.625-0.304h-12.714q-0.411 0-0.688 0.304t-0.277 0.661v19.661q0 0.125 0.107 0.018l5.196-6.286q0.411-0.464 0.679-0.598t0.857-0.134h4.268q0.393 0 0.661-0.259t0.321-0.527q0.429-2.321 0.661-3.411 0.071-0.375-0.205-0.714t-0.652-0.339h-5.25q-0.518 0-0.857-0.339t-0.339-0.857v-0.75q0-0.518 0.339-0.848t0.857-0.33h6.179q0.321 0 0.625-0.241t0.357-0.527zM21.911 3.786q-0.268 1.304-0.955 4.759t-1.241 6.25-0.625 3.098q-0.107 0.393-0.161 0.58t-0.25 0.58-0.438 0.589-0.688 0.375-1.036 0.179h-4.839q-0.232 0-0.393 0.179-0.143 0.161-7.607 8.821-0.393 0.446-1.045 0.509t-0.866-0.098q-0.982-0.393-0.982-1.75v-25.179q0-0.982 0.679-1.83t2.143-0.848h15.857q1.696 0 2.268 0.946t0.179 2.839zM21.911 3.786l-2.821 14.107q0.071-0.304 0.625-3.098t1.241-6.25 0.955-4.759z"></path> </symbol> <symbol id="icon-wordpress" viewBox="0 0 32 32"> <path class="path1" d="M2.268 16q0-2.911 1.196-5.589l6.554 17.946q-3.5-1.696-5.625-5.018t-2.125-7.339zM25.268 15.304q0 0.339-0.045 0.688t-0.179 0.884-0.205 0.786-0.313 1.054-0.313 1.036l-1.357 4.571-4.964-14.75q0.821-0.054 1.571-0.143 0.339-0.036 0.464-0.33t-0.045-0.554-0.509-0.241l-3.661 0.179q-1.339-0.018-3.607-0.179-0.214-0.018-0.366 0.089t-0.205 0.268-0.027 0.33 0.161 0.295 0.348 0.143l1.429 0.143 2.143 5.857-3 9-5-14.857q0.821-0.054 1.571-0.143 0.339-0.036 
0.464-0.33t-0.045-0.554-0.509-0.241l-3.661 0.179q-0.125 0-0.411-0.009t-0.464-0.009q1.875-2.857 4.902-4.527t6.563-1.67q2.625 0 5.009 0.946t4.259 2.661h-0.179q-0.982 0-1.643 0.723t-0.661 1.705q0 0.214 0.036 0.429t0.071 0.384 0.143 0.411 0.161 0.375 0.214 0.402 0.223 0.375 0.259 0.429 0.25 0.411q1.125 1.911 1.125 3.786zM16.232 17.196l4.232 11.554q0.018 0.107 0.089 0.196-2.25 0.786-4.554 0.786-2 0-3.875-0.571zM28.036 9.411q1.696 3.107 1.696 6.589 0 3.732-1.857 6.884t-4.982 4.973l4.196-12.107q1.054-3.018 1.054-4.929 0-0.75-0.107-1.411zM16 0q3.25 0 6.214 1.268t5.107 3.411 3.411 5.107 1.268 6.214-1.268 6.214-3.411 5.107-5.107 3.411-6.214 1.268-6.214-1.268-5.107-3.411-3.411-5.107-1.268-6.214 1.268-6.214 3.411-5.107 5.107-3.411 6.214-1.268zM16 31.268q3.089 0 5.92-1.214t4.875-3.259 3.259-4.875 1.214-5.92-1.214-5.92-3.259-4.875-4.875-3.259-5.92-1.214-5.92 1.214-4.875 3.259-3.259 4.875-1.214 5.92 1.214 5.92 3.259 4.875 4.875 3.259 5.92 1.214z"></path> </symbol> <symbol id="icon-stumbleupon" viewBox="0 0 34 32"> <path class="path1" d="M18.964 12.714v-2.107q0-0.75-0.536-1.286t-1.286-0.536-1.286 0.536-0.536 1.286v10.929q0 3.125-2.25 5.339t-5.411 2.214q-3.179 0-5.42-2.241t-2.241-5.42v-4.75h5.857v4.679q0 0.768 0.536 1.295t1.286 0.527 1.286-0.527 0.536-1.295v-11.071q0-3.054 2.259-5.214t5.384-2.161q3.143 0 5.393 2.179t2.25 5.25v2.429l-3.482 1.036zM28.429 16.679h5.857v4.75q0 3.179-2.241 5.42t-5.42 2.241q-3.161 0-5.411-2.223t-2.25-5.366v-4.786l2.339 1.089 3.482-1.036v4.821q0 0.75 0.536 1.277t1.286 0.527 1.286-0.527 0.536-1.277v-4.911z"></path> </symbol> <symbol id="icon-digg" viewBox="0 0 37 32"> <path class="path1" d="M5.857 5.036h3.643v17.554h-9.5v-12.446h5.857v-5.107zM5.857 19.661v-6.589h-2.196v6.589h2.196zM10.964 10.143v12.446h3.661v-12.446h-3.661zM10.964 5.036v3.643h3.661v-3.643h-3.661zM16.089 10.143h9.518v16.821h-9.518v-2.911h5.857v-1.464h-5.857v-12.446zM21.946 19.661v-6.589h-2.196v6.589h2.196zM27.071 10.143h9.5v16.821h-9.5v-2.911h5.839v-1.464h-5.839v-12.446zM32.911 19.661v-6.589h-2.196v6.589h2.196z"></path> </symbol> <symbol id="icon-spotify" viewBox="0 0 27 32"> <path class="path1" d="M20.125 21.607q0-0.571-0.536-0.911-3.446-2.054-7.982-2.054-2.375 0-5.125 0.607-0.75 0.161-0.75 0.929 0 0.357 0.241 0.616t0.634 0.259q0.089 0 0.661-0.143 2.357-0.482 4.339-0.482 4.036 0 7.089 1.839 0.339 0.196 0.589 0.196 0.339 0 0.589-0.241t0.25-0.616zM21.839 17.768q0-0.714-0.625-1.089-4.232-2.518-9.786-2.518-2.732 0-5.411 0.75-0.857 0.232-0.857 1.143 0 0.446 0.313 0.759t0.759 0.313q0.125 0 0.661-0.143 2.179-0.589 4.482-0.589 4.982 0 8.714 2.214 0.429 0.232 0.679 0.232 0.446 0 0.759-0.313t0.313-0.759zM23.768 13.339q0-0.839-0.714-1.25-2.25-1.304-5.232-1.973t-6.125-0.67q-3.643 0-6.5 0.839-0.411 0.125-0.688 0.455t-0.277 0.866q0 0.554 0.366 0.929t0.92 0.375q0.196 0 0.714-0.143 2.375-0.661 5.482-0.661 2.839 0 5.527 0.607t4.527 1.696q0.375 0.214 0.714 0.214 0.518 0 0.902-0.366t0.384-0.92zM27.429 16q0 3.732-1.839 6.884t-4.991 4.991-6.884 1.839-6.884-1.839-4.991-4.991-1.839-6.884 1.839-6.884 4.991-4.991 6.884-1.839 6.884 1.839 4.991 4.991 1.839 6.884z"></path> </symbol> <symbol id="icon-soundcloud" viewBox="0 0 41 32"> <path class="path1" d="M14 24.5l0.286-4.304-0.286-9.339q-0.018-0.179-0.134-0.304t-0.295-0.125q-0.161 0-0.286 0.125t-0.125 0.304l-0.25 9.339 0.25 4.304q0.018 0.179 0.134 0.295t0.277 0.116q0.393 0 0.429-0.411zM19.286 23.982l0.196-3.768-0.214-10.464q0-0.286-0.232-0.429-0.143-0.089-0.286-0.089t-0.286 0.089q-0.232 0.143-0.232 0.429l-0.018 0.107-0.179 10.339q0 0.018 0.196 4.214v0.018q0 0.179 0.107 0.304 0.161 0.196 0.411 
0.196 0.196 0 0.357-0.161 0.161-0.125 0.161-0.357zM0.625 17.911l0.357 2.286-0.357 2.25q-0.036 0.161-0.161 0.161t-0.161-0.161l-0.304-2.25 0.304-2.286q0.036-0.161 0.161-0.161t0.161 0.161zM2.161 16.5l0.464 3.696-0.464 3.625q-0.036 0.161-0.179 0.161-0.161 0-0.161-0.179l-0.411-3.607 0.411-3.696q0-0.161 0.161-0.161 0.143 0 0.179 0.161zM3.804 15.821l0.446 4.375-0.446 4.232q0 0.196-0.196 0.196-0.179 0-0.214-0.196l-0.375-4.232 0.375-4.375q0.036-0.214 0.214-0.214 0.196 0 0.196 0.214zM5.482 15.696l0.411 4.5-0.411 4.357q-0.036 0.232-0.25 0.232-0.232 0-0.232-0.232l-0.375-4.357 0.375-4.5q0-0.232 0.232-0.232 0.214 0 0.25 0.232zM7.161 16.018l0.375 4.179-0.375 4.393q-0.036 0.286-0.286 0.286-0.107 0-0.188-0.080t-0.080-0.205l-0.357-4.393 0.357-4.179q0-0.107 0.080-0.188t0.188-0.080q0.25 0 0.286 0.268zM8.839 13.411l0.375 6.786-0.375 4.393q0 0.125-0.089 0.223t-0.214 0.098q-0.286 0-0.321-0.321l-0.321-4.393 0.321-6.786q0.036-0.321 0.321-0.321 0.125 0 0.214 0.098t0.089 0.223zM10.518 11.875l0.339 8.357-0.339 4.357q0 0.143-0.098 0.241t-0.241 0.098q-0.321 0-0.357-0.339l-0.286-4.357 0.286-8.357q0.036-0.339 0.357-0.339 0.143 0 0.241 0.098t0.098 0.241zM12.268 11.161l0.321 9.036-0.321 4.321q-0.036 0.375-0.393 0.375-0.339 0-0.375-0.375l-0.286-4.321 0.286-9.036q0-0.161 0.116-0.277t0.259-0.116q0.161 0 0.268 0.116t0.125 0.277zM19.268 24.411v0 0zM15.732 11.089l0.268 9.107-0.268 4.268q0 0.179-0.134 0.313t-0.313 0.134-0.304-0.125-0.143-0.321l-0.25-4.268 0.25-9.107q0-0.196 0.134-0.321t0.313-0.125 0.313 0.125 0.134 0.321zM17.5 11.429l0.25 8.786-0.25 4.214q0 0.196-0.143 0.339t-0.339 0.143-0.339-0.143-0.161-0.339l-0.214-4.214 0.214-8.786q0.018-0.214 0.161-0.357t0.339-0.143 0.33 0.143 0.152 0.357zM21.286 20.214l-0.25 4.125q0 0.232-0.161 0.393t-0.393 0.161-0.393-0.161-0.179-0.393l-0.107-2.036-0.107-2.089 0.214-11.357v-0.054q0.036-0.268 0.214-0.429 0.161-0.125 0.357-0.125 0.143 0 0.268 0.089 0.25 0.143 0.286 0.464zM41.143 19.875q0 2.089-1.482 3.563t-3.571 1.473h-14.036q-0.232-0.036-0.393-0.196t-0.161-0.393v-16.054q0-0.411 0.5-0.589 1.518-0.607 3.232-0.607 3.482 0 6.036 2.348t2.857 5.777q0.946-0.393 1.964-0.393 2.089 0 3.571 1.482t1.482 3.589z"></path> </symbol> <symbol id="icon-codepen" viewBox="0 0 32 32"> <path class="path1" d="M3.857 20.875l10.768 7.179v-6.411l-5.964-3.982zM2.75 18.304l3.446-2.304-3.446-2.304v4.607zM17.375 28.054l10.768-7.179-4.804-3.214-5.964 3.982v6.411zM16 19.25l4.857-3.25-4.857-3.25-4.857 3.25zM8.661 14.339l5.964-3.982v-6.411l-10.768 7.179zM25.804 16l3.446 2.304v-4.607zM23.339 14.339l4.804-3.214-10.768-7.179v6.411zM32 11.125v9.75q0 0.732-0.607 1.143l-14.625 9.75q-0.375 0.232-0.768 0.232t-0.768-0.232l-14.625-9.75q-0.607-0.411-0.607-1.143v-9.75q0-0.732 0.607-1.143l14.625-9.75q0.375-0.232 0.768-0.232t0.768 0.232l14.625 9.75q0.607 0.411 0.607 1.143z"></path> </symbol> <symbol id="icon-twitch" viewBox="0 0 32 32"> <path class="path1" d="M16 7.75v7.75h-2.589v-7.75h2.589zM23.107 7.75v7.75h-2.589v-7.75h2.589zM23.107 21.321l4.518-4.536v-14.196h-21.321v18.732h5.821v3.875l3.875-3.875h7.107zM30.214 0v18.089l-7.75 7.75h-5.821l-3.875 3.875h-3.875v-3.875h-7.107v-20.679l1.946-5.161h26.482z"></path> </symbol> <symbol id="icon-meanpath" viewBox="0 0 27 32"> <path class="path1" d="M23.411 15.036v2.036q0 0.429-0.241 0.679t-0.67 0.25h-3.607q-0.429 0-0.679-0.25t-0.25-0.679v-2.036q0-0.429 0.25-0.679t0.679-0.25h3.607q0.429 0 0.67 0.25t0.241 0.679zM14.661 19.143v-4.464q0-0.946-0.58-1.527t-1.527-0.58h-2.375q-1.214 0-1.714 0.929-0.5-0.929-1.714-0.929h-2.321q-0.946 0-1.527 0.58t-0.58 1.527v4.464q0 0.393 0.375 0.393h0.982q0.393 0 
0.393-0.393v-4.107q0-0.429 0.241-0.679t0.688-0.25h1.679q0.429 0 0.679 0.25t0.25 0.679v4.107q0 0.393 0.375 0.393h0.964q0.393 0 0.393-0.393v-4.107q0-0.429 0.25-0.679t0.679-0.25h1.732q0.429 0 0.67 0.25t0.241 0.679v4.107q0 0.393 0.393 0.393h0.982q0.375 0 0.375-0.393zM25.179 17.429v-2.75q0-0.946-0.589-1.527t-1.536-0.58h-4.714q-0.946 0-1.536 0.58t-0.589 1.527v7.321q0 0.375 0.393 0.375h0.982q0.375 0 0.375-0.375v-3.214q0.554 0.75 1.679 0.75h3.411q0.946 0 1.536-0.58t0.589-1.527zM27.429 6.429v19.143q0 1.714-1.214 2.929t-2.929 1.214h-19.143q-1.714 0-2.929-1.214t-1.214-2.929v-19.143q0-1.714 1.214-2.929t2.929-1.214h19.143q1.714 0 2.929 1.214t1.214 2.929z"></path> </symbol> <symbol id="icon-pinterest-p" viewBox="0 0 23 32"> <path class="path1" d="M0 10.661q0-1.929 0.67-3.634t1.848-2.973 2.714-2.196 3.304-1.393 3.607-0.464q2.821 0 5.25 1.188t3.946 3.455 1.518 5.125q0 1.714-0.339 3.357t-1.071 3.161-1.786 2.67-2.589 1.839-3.375 0.688q-1.214 0-2.411-0.571t-1.714-1.571q-0.179 0.696-0.5 2.009t-0.42 1.696-0.366 1.268-0.464 1.268-0.571 1.116-0.821 1.384-1.107 1.545l-0.25 0.089-0.161-0.179q-0.268-2.804-0.268-3.357 0-1.643 0.384-3.688t1.188-5.134 0.929-3.625q-0.571-1.161-0.571-3.018 0-1.482 0.929-2.786t2.357-1.304q1.089 0 1.696 0.723t0.607 1.83q0 1.179-0.786 3.411t-0.786 3.339q0 1.125 0.804 1.866t1.946 0.741q0.982 0 1.821-0.446t1.402-1.214 1-1.696 0.679-1.973 0.357-1.982 0.116-1.777q0-3.089-1.955-4.813t-5.098-1.723q-3.571 0-5.964 2.313t-2.393 5.866q0 0.786 0.223 1.518t0.482 1.161 0.482 0.813 0.223 0.545q0 0.5-0.268 1.304t-0.661 0.804q-0.036 0-0.304-0.054-0.911-0.268-1.616-1t-1.089-1.688-0.58-1.929-0.196-1.902z"></path> </symbol> <symbol id="icon-periscope" viewBox="0 0 24 28"> <path class="path1" d="M12.285,1C6.696,1,2.277,5.643,2.277,11.243c0,5.851,7.77,14.578,10.007,14.578c1.959,0,9.729-8.728,9.729-14.578 C22.015,5.643,17.596,1,12.285,1z M12.317,16.551c-3.473,0-6.152-2.611-6.152-5.664c0-1.292,0.39-2.472,1.065-3.438 c0.206,1.084,1.18,1.906,2.352,1.906c1.322,0,2.393-1.043,2.393-2.333c0-0.832-0.447-1.561-1.119-1.975 c0.467-0.105,0.955-0.161,1.46-0.161c3.133,0,5.81,2.611,5.81,5.998C18.126,13.94,15.449,16.551,12.317,16.551z"></path> </symbol> <symbol id="icon-get-pocket" viewBox="0 0 31 32"> <path class="path1" d="M27.946 2.286q1.161 0 1.964 0.813t0.804 1.973v9.268q0 3.143-1.214 6t-3.259 4.911-4.893 3.259-5.973 1.205q-3.143 0-5.991-1.205t-4.902-3.259-3.268-4.911-1.214-6v-9.268q0-1.143 0.821-1.964t1.964-0.821h25.161zM15.375 21.286q0.839 0 1.464-0.589l7.214-6.929q0.661-0.625 0.661-1.518 0-0.875-0.616-1.491t-1.491-0.616q-0.839 0-1.464 0.589l-5.768 5.536-5.768-5.536q-0.625-0.589-1.446-0.589-0.875 0-1.491 0.616t-0.616 1.491q0 0.911 0.643 1.518l7.232 6.929q0.589 0.589 1.446 0.589z"></path> </symbol> <symbol id="icon-vimeo" viewBox="0 0 32 32"> <path class="path1" d="M30.518 9.25q-0.179 4.214-5.929 11.625-5.946 7.696-10.036 7.696-2.536 0-4.286-4.696-0.786-2.857-2.357-8.607-1.286-4.679-2.804-4.679-0.321 0-2.268 1.357l-1.375-1.75q0.429-0.375 1.929-1.723t2.321-2.063q2.786-2.464 4.304-2.607 1.696-0.161 2.732 0.991t1.446 3.634q0.786 5.125 1.179 6.661 0.982 4.446 2.143 4.446 0.911 0 2.75-2.875 1.804-2.875 1.946-4.393 0.232-2.482-1.946-2.482-1.018 0-2.161 0.464 2.143-7.018 8.196-6.821 4.482 0.143 4.214 5.821z"></path> </symbol> <symbol id="icon-reddit-alien" viewBox="0 0 32 32"> <path class="path1" d="M32 15.107q0 1.036-0.527 1.884t-1.42 1.295q0.214 0.821 0.214 1.714 0 2.768-1.902 5.125t-5.188 3.723-7.143 1.366-7.134-1.366-5.179-3.723-1.902-5.125q0-0.839 0.196-1.679-0.911-0.446-1.464-1.313t-0.554-1.902q0-1.464 
1.036-2.509t2.518-1.045q1.518 0 2.589 1.125 3.893-2.714 9.196-2.893l2.071-9.304q0.054-0.232 0.268-0.375t0.464-0.089l6.589 1.446q0.321-0.661 0.964-1.063t1.411-0.402q1.107 0 1.893 0.777t0.786 1.884-0.786 1.893-1.893 0.786-1.884-0.777-0.777-1.884l-5.964-1.321-1.857 8.429q5.357 0.161 9.268 2.857 1.036-1.089 2.554-1.089 1.482 0 2.518 1.045t1.036 2.509zM7.464 18.661q0 1.107 0.777 1.893t1.884 0.786 1.893-0.786 0.786-1.893-0.786-1.884-1.893-0.777q-1.089 0-1.875 0.786t-0.786 1.875zM21.929 25q0.196-0.196 0.196-0.464t-0.196-0.464q-0.179-0.179-0.446-0.179t-0.464 0.179q-0.732 0.75-2.161 1.107t-2.857 0.357-2.857-0.357-2.161-1.107q-0.196-0.179-0.464-0.179t-0.446 0.179q-0.196 0.179-0.196 0.455t0.196 0.473q0.768 0.768 2.116 1.214t2.188 0.527 1.625 0.080 1.625-0.080 2.188-0.527 2.116-1.214zM21.875 21.339q1.107 0 1.884-0.786t0.777-1.893q0-1.089-0.786-1.875t-1.875-0.786q-1.107 0-1.893 0.777t-0.786 1.884 0.786 1.893 1.893 0.786z"></path> </symbol> <symbol id="icon-hashtag" viewBox="0 0 32 32"> <path class="path1" d="M17.696 18.286l1.143-4.571h-4.536l-1.143 4.571h4.536zM31.411 9.286l-1 4q-0.125 0.429-0.554 0.429h-5.839l-1.143 4.571h5.554q0.268 0 0.446 0.214 0.179 0.25 0.107 0.5l-1 4q-0.089 0.429-0.554 0.429h-5.839l-1.446 5.857q-0.125 0.429-0.554 0.429h-4q-0.286 0-0.464-0.214-0.161-0.214-0.107-0.5l1.393-5.571h-4.536l-1.446 5.857q-0.125 0.429-0.554 0.429h-4.018q-0.268 0-0.446-0.214-0.161-0.214-0.107-0.5l1.393-5.571h-5.554q-0.268 0-0.446-0.214-0.161-0.214-0.107-0.5l1-4q0.125-0.429 0.554-0.429h5.839l1.143-4.571h-5.554q-0.268 0-0.446-0.214-0.179-0.25-0.107-0.5l1-4q0.089-0.429 0.554-0.429h5.839l1.446-5.857q0.125-0.429 0.571-0.429h4q0.268 0 0.446 0.214 0.161 0.214 0.107 0.5l-1.393 5.571h4.536l1.446-5.857q0.125-0.429 0.571-0.429h4q0.268 0 0.446 0.214 0.161 0.214 0.107 0.5l-1.393 5.571h5.554q0.268 0 0.446 0.214 0.161 0.214 0.107 0.5z"></path> </symbol> <symbol id="icon-chain" viewBox="0 0 30 32"> <path class="path1" d="M26 21.714q0-0.714-0.5-1.214l-3.714-3.714q-0.5-0.5-1.214-0.5-0.75 0-1.286 0.571 0.054 0.054 0.339 0.33t0.384 0.384 0.268 0.339 0.232 0.455 0.063 0.491q0 0.714-0.5 1.214t-1.214 0.5q-0.268 0-0.491-0.063t-0.455-0.232-0.339-0.268-0.384-0.384-0.33-0.339q-0.589 0.554-0.589 1.304 0 0.714 0.5 1.214l3.679 3.696q0.482 0.482 1.214 0.482 0.714 0 1.214-0.464l2.625-2.607q0.5-0.5 0.5-1.196zM13.446 9.125q0-0.714-0.5-1.214l-3.679-3.696q-0.5-0.5-1.214-0.5-0.696 0-1.214 0.482l-2.625 2.607q-0.5 0.5-0.5 1.196 0 0.714 0.5 1.214l3.714 3.714q0.482 0.482 1.214 0.482 0.75 0 1.286-0.554-0.054-0.054-0.339-0.33t-0.384-0.384-0.268-0.339-0.232-0.455-0.063-0.491q0-0.714 0.5-1.214t1.214-0.5q0.268 0 0.491 0.063t0.455 0.232 0.339 0.268 0.384 0.384 0.33 0.339q0.589-0.554 0.589-1.304zM29.429 21.714q0 2.143-1.518 3.625l-2.625 2.607q-1.482 1.482-3.625 1.482-2.161 0-3.643-1.518l-3.679-3.696q-1.482-1.482-1.482-3.625 0-2.196 1.571-3.732l-1.571-1.571q-1.536 1.571-3.714 1.571-2.143 0-3.643-1.5l-3.714-3.714q-1.5-1.5-1.5-3.643t1.518-3.625l2.625-2.607q1.482-1.482 3.625-1.482 2.161 0 3.643 1.518l3.679 3.696q1.482 1.482 1.482 3.625 0 2.196-1.571 3.732l1.571 1.571q1.536-1.571 3.714-1.571 2.143 0 3.643 1.5l3.714 3.714q1.5 1.5 1.5 3.643z"></path> </symbol> <symbol id="icon-thumb-tack" viewBox="0 0 21 32"> <path class="path1" d="M8.571 15.429v-8q0-0.25-0.161-0.411t-0.411-0.161-0.411 0.161-0.161 0.411v8q0 0.25 0.161 0.411t0.411 0.161 0.411-0.161 0.161-0.411zM20.571 21.714q0 0.464-0.339 0.804t-0.804 0.339h-7.661l-0.911 8.625q-0.036 0.214-0.188 0.366t-0.366 0.152h-0.018q-0.482 0-0.571-0.482l-1.357-8.661h-7.214q-0.464 0-0.804-0.339t-0.339-0.804q0-2.196 
1.402-3.955t3.17-1.759v-9.143q-0.929 0-1.607-0.679t-0.679-1.607 0.679-1.607 1.607-0.679h11.429q0.929 0 1.607 0.679t0.679 1.607-0.679 1.607-1.607 0.679v9.143q1.768 0 3.17 1.759t1.402 3.955z"></path> </symbol> <symbol id="icon-arrow-left" viewBox="0 0 43 32"> <path class="path1" d="M42.311 14.044c-0.178-0.178-0.533-0.356-0.711-0.356h-33.778l10.311-10.489c0.178-0.178 0.356-0.533 0.356-0.711 0-0.356-0.178-0.533-0.356-0.711l-1.6-1.422c-0.356-0.178-0.533-0.356-0.889-0.356s-0.533 0.178-0.711 0.356l-14.578 14.933c-0.178 0.178-0.356 0.533-0.356 0.711s0.178 0.533 0.356 0.711l14.756 14.933c0 0.178 0.356 0.356 0.533 0.356s0.533-0.178 0.711-0.356l1.6-1.6c0.178-0.178 0.356-0.533 0.356-0.711s-0.178-0.533-0.356-0.711l-10.311-10.489h33.778c0.178 0 0.533-0.178 0.711-0.356 0.356-0.178 0.533-0.356 0.533-0.711v-2.133c0-0.356-0.178-0.711-0.356-0.889z"></path> </symbol> <symbol id="icon-arrow-right" viewBox="0 0 43 32"> <path class="path1" d="M0.356 17.956c0.178 0.178 0.533 0.356 0.711 0.356h33.778l-10.311 10.489c-0.178 0.178-0.356 0.533-0.356 0.711 0 0.356 0.178 0.533 0.356 0.711l1.6 1.6c0.178 0.178 0.533 0.356 0.711 0.356s0.533-0.178 0.711-0.356l14.756-14.933c0.178-0.356 0.356-0.711 0.356-0.889s-0.178-0.533-0.356-0.711l-14.756-14.933c0-0.178-0.356-0.356-0.533-0.356s-0.533 0.178-0.711 0.356l-1.6 1.6c-0.178 0.178-0.356 0.533-0.356 0.711s0.178 0.533 0.356 0.711l10.311 10.489h-33.778c-0.178 0-0.533 0.178-0.711 0.356-0.356 0.178-0.533 0.356-0.533 0.711v2.311c0 0.178 0.178 0.533 0.356 0.711z"></path> </symbol> <symbol id="icon-play" viewBox="0 0 22 28"> <path d="M21.625 14.484l-20.75 11.531c-0.484 0.266-0.875 0.031-0.875-0.516v-23c0-0.547 0.391-0.781 0.875-0.516l20.75 11.531c0.484 0.266 0.484 0.703 0 0.969z"></path> </symbol> <symbol id="icon-pause" viewBox="0 0 24 28"> <path d="M24 3v22c0 0.547-0.453 1-1 1h-8c-0.547 0-1-0.453-1-1v-22c0-0.547 0.453-1 1-1h8c0.547 0 1 0.453 1 1zM10 3v22c0 0.547-0.453 1-1 1h-8c-0.547 0-1-0.453-1-1v-22c0-0.547 0.453-1 1-1h8c0.547 0 1 0.453 1 1z"></path> </symbol> </defs> </svg> <script src="https://bilderupload.net/wp-content/cache/min/1/b5213d67c5d0e754e158f677f523f49c.js" data-minify="1" defer></script><noscript><link rel='stylesheet' id='wp-block-library-css' href='https://bilderupload.net/wp-includes/css/dist/block-library/style.min.css?ver=6.5.2' type='text/css' media='all' /><link rel='stylesheet' id='blog-tales-google-fonts-css' href='https://fonts.googleapis.com/css?family=Playfair+Display%3A400%2C700%7CLato%3A400%2C700&subset=latin%2Clatin-ext' type='text/css' media='all' /><link data-minify="1" rel='stylesheet' id='blog-tales-blocks-css' href='https://bilderupload.net/wp-content/cache/min/1/wp-content/themes/blog-tales/assets/css/blocks.css?ver=1706965618' type='text/css' media='all' /><link data-minify="1" rel='stylesheet' id='blog-tales-style-css' href='https://bilderupload.net/wp-content/cache/min/1/wp-content/themes/blog-tales/style.css?ver=1706965618' type='text/css' media='all' /></noscript></body> </html> <!-- This website is like a Rocket, isn't it? Performance optimized by WP Rocket. Learn more: https://wp-rocket.me - Debug: cached@1713017076 -->