Parsing HTML with BeautifulSoup

Guide to Parsing HTML with BeautifulSoup in Python – Stack Abuse

Introduction
Web scraping is programmatically collecting information from various websites. While there are many libraries and frameworks in various languages that can extract web data, Python has long been a popular choice because of its plethora of options for web scraping.
This article will give you a crash course on web scraping in Python with Beautiful Soup – a popular Python library for parsing HTML and XML.
Ethical Web Scraping
Web scraping is ubiquitous and can give us data much as an API would. However, as good citizens of the internet, it's our responsibility to respect the site owners we scrape from. Here are some principles that a web scraper should adhere to:
Don’t claim scraped content as our own. Website owners sometimes spend a lengthy amount of time creating articles, collecting details about products or harvesting other content. We must respect their labor and originality.
Don't scrape a website that doesn't want to be scraped. Websites often come with a robots.txt file, which defines the parts of the site that can be scraped (a quick way to check this programmatically is sketched after this list). Many websites also have a Terms of Use which may not allow scraping. We must respect websites that do not want to be scraped.
Is there an API available already? Splendid, there’s no need for us to write a scraper. APIs are created to provide access to data in a controlled way as defined by the owners of the data. We prefer to use APIs if they’re available.
Scrape at a reasonable rate. Making too many requests takes a toll on a website's performance, and a web scraper that hammers a site can be as debilitating as a DDoS attack. We must scrape responsibly so we don't cause any disruption to the regular functioning of the website.
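As a quick illustration of the robots.txt point above, here is a minimal sketch (using Python's built-in urllib.robotparser, with a placeholder example.com URL) of how a scraper might check whether it is allowed to fetch a page:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()

# can_fetch() reports whether the given user agent may crawl the URL
if rp.can_fetch("*", "https://example.com/some/page"):
    print("Allowed to fetch this page")
else:
    print("Disallowed by robots.txt - skip it")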
An Overview of Beautiful Soup
Beautiful Soup can parse and extract data from the HTML content of web pages. In the following sections, we will cover the functions that are most useful for scraping.
What makes Beautiful Soup so useful is the myriad of functions it provides to extract data from HTML.
Let's get hands-on and see how we can parse HTML with Beautiful Soup. Consider the following HTML page, saved to a file we'll call doc.html (the link URLs below are just placeholders):

<html>
<head>
    <title>Head's title</title>
</head>

<body>
    <p class="title"><b>Body's title</b></p>
    <p class="story">line begins
        <a href="http://example.com/element1" class="element" id="link1">1</a>
        <a href="http://example.com/element2" class="element" id="link2">2</a>
        <a href="http://example.com/element3" class="avatar" id="link3">3</a>
        <p>line ends</p>
    </p>
</body>
</html>
The following code snippets are tested on Ubuntu 20.04.1 LTS. You can install the BeautifulSoup module by typing the following command in the terminal:
$ pip3 install beautifulsoup4
The HTML file doc.html needs to be loaded into Beautiful Soup. This is done by passing the file to the BeautifulSoup constructor. Let's use the interactive Python shell for this, so we can instantly print the contents of a specific part of the page:
from bs4 import BeautifulSoup
with open("doc.html") as fp:
    soup = BeautifulSoup(fp, "html.parser")
Now we can use Beautiful Soup to navigate our website and extract data.
Navigating to Specific Tags
From the soup object created in the previous section, let's get the title tag of doc.html:

soup.head.title   # returns <title>Head's title</title>

Here's a breakdown of each component we used to get the title: soup is the parsed document, .head selects the <head> tag inside it, and .title selects the <title> tag inside that.
Beautiful Soup is powerful because our Python objects match the nested structure of the HTML document we are scraping.
To get the text of the first <a> tag, enter this:

soup.body.a.text  # returns '1'
To get the title within the HTML's body tag (denoted by the "title" class), type the following in your terminal:

soup.body.p.b  # returns <b>Body's title</b>
For deeply nested HTML documents, navigation could quickly become tedious. Luckily, Beautiful Soup comes with a search function so we don’t have to navigate to retrieve HTML elements.
Searching the Elements of Tags
The find_all() method takes an HTML tag as a string argument and returns the list of elements that match the provided tag. For example, if we want all a tags in doc.html:

soup.find_all("a")

We'll see this list of a tags as output:

[<a class="element" href="http://example.com/element1" id="link1">1</a>,
<a class="element" href="http://example.com/element2" id="link2">2</a>,
<a class="avatar" href="http://example.com/element3" id="link3">3</a>]

Here's a breakdown of each component we used to search for a tag: soup is the parsed document, find_all() is the search method, and "a" is the name of the tag we're looking for.
We can search for tags of a specific class as well by providing the class_ argument. Beautiful Soup uses class_ because class is a reserved keyword in Python. Let’s search for all a tags that have the “element” class:
soup.find_all("a", class_="element")

As we only have two links with the "element" class, you'll see this output:

[<a class="element" href="http://example.com/element1" id="link1">1</a>,
<a class="element" href="http://example.com/element2" id="link2">2</a>]
What if we wanted to fetch the links embedded inside the a tags? Let's retrieve a link's href attribute using the find() method. It works just like find_all(), but it returns the first matching element instead of a list. Type this in your shell:

soup.find("a", href=True)["href"]  # returns 'http://example.com/element1'
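A small practical note (a sketch of its own, separate from doc.html): find() returns None when nothing matches, while find_all() returns an empty list, so it's worth guarding before you index into the result:

from bs4 import BeautifulSoup

empty_soup = BeautifulSoup("<p>no links here</p>", "html.parser")

print(empty_soup.find("a"))      # None - there is no <a> tag
print(empty_soup.find_all("a"))  # [] - an empty list

first_link = empty_soup.find("a", href=True)
if first_link is not None:       # guard before indexing into the tag
    print(first_link["href"])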
The find() and find_all() functions also accept a regular expression instead of a string. Behind the scenes, the text will be filtered using the compiled regular expression’s search() method. For example:
import re
for tag in soup.find_all(re.compile("^b")):
    print(tag)
Iterating over the results fetches the tags whose names start with the character b, which includes <body> and <b>:

<body>
<p class="title"><b>Body's title</b></p>
<p class="story">line begins
<a class="element" href="http://example.com/element1" id="link1">1</a>
<a class="element" href="http://example.com/element2" id="link2">2</a>
<a class="avatar" href="http://example.com/element3" id="link3">3</a>
<p>line ends</p>
</p>
</body>
<b>Body's title</b>
We've covered the most popular ways to get tags and their attributes. Sometimes, especially for less dynamic web pages, we just want the text of the page. Let's see how we can get it!
Getting the Whole Text
The get_text() function retrieves all the text from the HTML document. Let’s get all the text of the HTML document:
soup.get_text()
Your output should be like this:
Head’s title
Body’s title
line begins
1
2
3
line ends
Sometimes the newline characters are printed, so your output may look like this as well:
“\n\nHead’s title\n\n\nBody’s title\nline begins\n 1\n2\n3\n line ends\n\n”
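If the stray newlines bother you, get_text() also accepts separator and strip arguments; a minimal sketch, continuing with the same soup object:

soup.get_text(separator=" ", strip=True)
# returns something like "Head's title Body's title line begins 1 2 3 line ends"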
Now that we have a feel for how to use Beautiful Soup, let’s scrape a website!
Beautiful Soup in Action – Scraping a Book List
Now that we have mastered the components of Beautiful Soup, it's time to put our learning to use. Let's build a scraper to extract data from https://books.toscrape.com and save it to a CSV file. The site contains random data about books and is a great space to test out your web scraping techniques.
First, create a new file for the script (we'll call it bookscraper.py). Let's import all the libraries we need for this script:
import requests
import time
import csv
import re

from bs4 import BeautifulSoup
In the modules mentioned above:
requests – performs the URL request and fetches the website’s HTML
time – lets us pause between requests so we don't hit the page too often
csv – helps us export our scraped data to a CSV file
re – allows us to write regular expressions that will come in handy for picking text based on its pattern
bs4 – yours truly, the scraping module to parse the HTML
You should have bs4 installed already, and time, csv, and re are built-in packages in Python. You'll need to install the requests module directly like this:
$ pip3 install requests
Before you begin, you need to understand how the webpage's HTML is structured. In your browser, go to https://books.toscrape.com. Then right-click on a component of the webpage you want to scrape and choose Inspect to understand the hierarchy of the tags.
This will show you the underlying HTML for the element you're inspecting.
From inspecting the HTML, we learn how to access the URL of the book, the cover image, the title, the rating, the price, and more fields from the HTML. Let's write a function that scrapes a book item and extracts its data:

def scrape(source_url, soup):  # Takes the base URL and the soup object as params
    # Find all the book entries - each is an <article class="product_pod"> tag
    books = soup.find_all("article", class_="product_pod")
    # Iterate over each book article tag
    for each_book in books:
        info_url = source_url + "/" + each_book.h3.find("a")["href"]
        cover_url = source_url + "/catalogue" + \
            each_book.a.img["src"].replace("..", "")

        title = each_book.h3.find("a")["title"]
        rating = each_book.find("p", class_="star-rating")["class"][1]
        # can also be written as: each_book.h3.find("a").get("title")
        price = each_book.find("p", class_="price_color").text.strip().encode(
            "ascii", "ignore").decode("ascii")
        availability = each_book.find(
            "p", class_="instock availability").text.strip()

        # Invoke the write_to_csv function
        write_to_csv([info_url, cover_url, title, rating, price, availability])
The last line of the above snippet points to a function to write the list of scraped strings to a CSV file. Let’s add that function now:
def write_to_csv(list_input):
    # The scraped info will be written to a CSV here.
    try:
        # "books.csv" is just the output filename we chose for this script
        with open("books.csv", "a") as fopen:  # Open the CSV file in append mode.
            csv_writer = csv.writer(fopen)
            csv_writer.writerow(list_input)
    except:
        return False
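Because we open the file in append mode, the CSV has no header row. Optionally, you could write one once before the scrape starts; a small sketch (the column names simply mirror the list we pass to write_to_csv()):

def write_csv_header():
    # Creates/overwrites books.csv with a single header row
    with open("books.csv", "w", newline="") as fopen:
        csv.writer(fopen).writerow(
            ["info_url", "cover_url", "title", "rating", "price", "availability"])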
As we have a function that can scrape a page and export to CSV, we want another function that crawls through the paginated website, collecting book data on each page.
To do this, let’s look at the URL we are writing this scraper for:
https://books.toscrape.com/catalogue/page-1.html
The only varying element in the URL is the page number. We can format the URL dynamically so it becomes a seed URL:
"https://books.toscrape.com/catalogue/page-{}.html".format(str(page_number))

This string-formatted URL with the page number can be fetched using requests.get(). We can then create a new BeautifulSoup object. Every time we get the soup object, the presence of the "next" button is checked so we can stop at the last page. We keep track of a counter for the page number that's incremented by 1 after successfully scraping a page.
def browse_and_scrape(seed_url, page_number=1):
    # Fetch the base URL - we will be using this to build the image and info routes
    url_pat = re.compile(r"(.*\.com)")
    source_url = url_pat.search(seed_url).group(0)

    # page_number from the argument gets formatted into the URL & fetched
    formatted_url = seed_url.format(str(page_number))

    try:
        html_text = requests.get(formatted_url).text
        # Prepare the soup
        soup = BeautifulSoup(html_text, "html.parser")
        print(f"Now Scraping - {formatted_url}")

        # This if clause stops the script when it hits an empty page
        if soup.find("li", class_="next") != None:
            scrape(source_url, soup)  # Invoke the scrape function
            # Be a responsible citizen by waiting before you hit the site again
            time.sleep(3)
            page_number += 1
            # Recursively invoke the same function with the incremented page number
            return browse_and_scrape(seed_url, page_number)
        else:
            scrape(source_url, soup)  # The last page is scraped here
            return True
    except Exception as e:
        return e
The function above, browse_and_scrape(), is recursively called until soup.find("li", class_="next") returns None. At this point, the code scrapes the remaining part of the webpage and exits.
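Recursion is fine for a small site like this one, but on a much larger site it could eventually hit Python's recursion limit. A rough iterative equivalent (a sketch, not part of the original script) would look like this:

def browse_and_scrape_iter(seed_url):
    source_url = re.compile(r"(.*\.com)").search(seed_url).group(0)
    page_number = 1
    while True:
        formatted_url = seed_url.format(str(page_number))
        soup = BeautifulSoup(requests.get(formatted_url).text, "html.parser")
        print(f"Now Scraping - {formatted_url}")
        scrape(source_url, soup)
        if soup.find("li", class_="next") is None:
            return True       # no "next" button - last page reached
        time.sleep(3)         # be polite between requests
        page_number += 1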
For the final piece to the puzzle, we initiate the scraping flow. We define the seed_url and call the browse_and_scrape() to get the data. This is done under the if __name__ == “__main__” block:
if __name__ == "__main__":
    seed_url = "https://books.toscrape.com/catalogue/page-{}.html"
    print("Web scraping has begun")
    result = browse_and_scrape(seed_url)
    if result == True:
        print("Web scraping is now complete!")
    else:
        print(f"Oops, that doesn't seem right!!! - {result}")
If you’d like to learn more about the if __name__ == “__main__” block, check out our guide on how it works.
You can execute the script as shown below in your terminal and get the output as:
$ python bookscraper.py
Web scraping has begun
Now Scraping - https://books.toscrape.com/catalogue/page-1.html
Now Scraping - https://books.toscrape.com/catalogue/page-2.html
Now Scraping - https://books.toscrape.com/catalogue/page-3.html
...
Web scraping is now complete!
The scraped data can be found in the current working directory under the filename books.csv (the name we used in write_to_csv()). Here's a sample of the file's content, with the book and cover URL columns trimmed for readability:

A Light in the Attic,Three,51.77,In stock
Tipping the Velvet,One,53.74,In stock
...
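If you want to sanity-check the output, a quick sketch that reads the file back (assuming the books.csv name used above):

import csv

with open("books.csv", newline="") as f:
    for row in csv.reader(f):
        print(row[2], row[4])  # title and price columns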
Good job! If you wanted to have a look at the scraper code as a whole, you can find it on GitHub.
Conclusion
In this tutorial, we learned the ethics of writing good web scrapers. We then used Beautiful Soup to extract data from an HTML file using the BeautifulSoup object's properties and its various methods like find(), find_all() and get_text(). We then built a scraper that retrieves a book list online and exports it to CSV.
Web scraping is a useful skill that helps in various activities such as extracting data like an API, performing QA on a website, checking for broken URLs on a website, and more. What’s the next scraper you’re going to build?
Beautiful Soup 4.9.0 documentation – Crummy

Beautiful Soup is a
Python library for pulling data out of HTML and XML files. It works
with your favorite parser to provide idiomatic ways of navigating,
searching, and modifying the parse tree. It commonly saves programmers
hours or days of work.
These instructions illustrate all major features of Beautiful Soup 4,
with examples. I show you what the library is good for, how it works,
how to use it, how to make it do what you want, and what to do when it
violates your expectations.
This document covers Beautiful Soup version 4.9.3. The examples in
this documentation should work the same way in Python 2.7 and Python 3.8.
You might be looking for the documentation for Beautiful Soup 3.
If so, you should know that Beautiful Soup 3 is no longer being
developed and that support for it will be dropped on or after December
31, 2020. If you want to learn about the differences between Beautiful
Soup 3 and Beautiful Soup 4, see Porting code to BS4.
This documentation has been translated into other languages by
Beautiful Soup users:
This documentation is also available in Chinese.
This page is available in Japanese (external link).
This documentation is also available in a Korean translation.
This document is also available in Brazilian Portuguese.
This documentation is available in Russian.
Getting help¶
If you have questions about Beautiful Soup, or run into problems,
send mail to the discussion group. If
your problem involves parsing an HTML document, be sure to mention
what the diagnose() function says about
that document.
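For reference, a minimal sketch of calling that helper (it lives in bs4.diagnose and prints how each installed parser handles your markup; the filename here is just a placeholder):

from bs4.diagnose import diagnose

with open("bad_page.html") as fp:   # placeholder filename
    data = fp.read()
diagnose(data)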
Here’s an HTML document I’ll be using as an example throughout this
document. It’s part of a story from Alice in Wonderland:
html_doc = """<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>

<p class="story">Once upon a time there were three little sisters; and their names were
<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>,
<a href="http://example.com/lacie" class="sister" id="link2">Lacie</a> and
<a href="http://example.com/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>

<p class="story">...</p>
"""
Running the “three sisters” document through Beautiful Soup gives us a
BeautifulSoup object, which represents the document as a nested
data structure:
from bs4 import BeautifulSoup
soup = BeautifulSoup(html_doc, 'html.parser')
print(soup.prettify())
# <html>
#  <head>
#   <title>
#    The Dormouse's story
#   </title>
#  </head>
#  <body>
#   <p class="title">
#    <b>
#     The Dormouse's story
#    </b>
#   </p>
#   <p class="story">
#    Once upon a time there were three little sisters; and their names were
#    <a class="sister" href="http://example.com/elsie" id="link1">
#     Elsie
#    </a>
#    ,
#    <a class="sister" href="http://example.com/lacie" id="link2">
#     Lacie
#    </a>
#    and
#    <a class="sister" href="http://example.com/tillie" id="link3">
#     Tillie
#    </a>
#    ; and they lived at the bottom of a well.
#   </p>
#   <p class="story">
#    ...
#   </p>
#  </body>
# </html>
Here are some simple ways to navigate that data structure:
soup.title
# <title>The Dormouse's story</title>

soup.title.name
# u'title'

soup.title.string
# u'The Dormouse's story'

soup.title.parent.name
# u'head'

soup.p
# <p class="title"><b>The Dormouse's story</b></p>

soup.p['class']
# u'title'

soup.a
# <a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>

soup.find_all('a')
# [<a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>,
#  <a class="sister" href="http://example.com/lacie" id="link2">Lacie</a>,
#  <a class="sister" href="http://example.com/tillie" id="link3">Tillie</a>]

soup.find(id="link3")
# <a class="sister" href="http://example.com/tillie" id="link3">Tillie</a>
One common task is extracting all the URLs found within a page's <a> tags:

for link in soup.find_all('a'):
    print(link.get('href'))
# http://example.com/elsie
# http://example.com/lacie
# http://example.com/tillie
Another common task is extracting all the text from a page:
print(soup.get_text())
# The Dormouse's story
#
# The Dormouse's story
#
# Once upon a time there were three little sisters; and their names were
# Elsie,
# Lacie and
# Tillie;
# and they lived at the bottom of a well.
#
# ...
Does this look like what you need? If so, read on.
If you’re using a recent version of Debian or Ubuntu Linux, you can
install Beautiful Soup with the system package manager:
$ apt-get install python-bs4 (for Python 2)
$ apt-get install python3-bs4 (for Python 3)
Beautiful Soup 4 is published through PyPi, so if you can’t install it
with the system packager, you can install it with easy_install or
pip. The package name is beautifulsoup4, and the same package
works on Python 2 and Python 3. Make sure you use the right version of
pip or easy_install for your Python version (these may be named
pip3 and easy_install3 respectively if you’re using Python 3).
$ easy_install beautifulsoup4
$ pip install beautifulsoup4
(The BeautifulSoup package is not what you want. That’s
the previous major release, Beautiful Soup 3. Lots of software uses
BS3, so it’s still available, but if you’re writing new code you
should install beautifulsoup4. )
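A quick way to confirm which package and version you ended up with (a small sketch; the version number will of course depend on what you installed):

$ python3 -c "import bs4; print(bs4.__version__)"
4.9.3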
If you don’t have easy_install or pip installed, you can
download the Beautiful Soup 4 source tarball and
install it with
$ python setup.py install
If all else fails, the license for Beautiful Soup allows you to
package the entire library with your application. You can download the
tarball, copy its bs4 directory into your application’s codebase,
and use Beautiful Soup without installing it at all.
I use Python 2.7 and Python 3.8 to develop Beautiful Soup, but it
should work with other recent versions.
Problems after installation¶
Beautiful Soup is packaged as Python 2 code. When you install it for
use with Python 3, it’s automatically converted to Python 3 code. If
you don’t install the package, the code won’t be converted. There have
also been reports on Windows machines of the wrong version being
installed.
If you get the ImportError “No module named HTMLParser”, your
problem is that you’re running the Python 2 version of the code under
Python 3.
If you get the ImportError "No module named html.parser", your
problem is that you’re running the Python 3 version of the code under
Python 2.
In both cases, your best bet is to completely remove the Beautiful
Soup installation from your system (including any directory created
when you unzipped the tarball) and try the installation again.
If you get the SyntaxError “Invalid syntax” on the line
ROOT_TAG_NAME = u'[document]', you need to convert the Python 2
code to Python 3. You can do this either by installing the package:
$ python3 setup.py install
or by manually running Python’s 2to3 conversion script on the
bs4 directory:
$ 2to3-3.2 -w bs4
Installing a parser¶
Beautiful Soup supports the HTML parser included in Python’s standard
library, but it also supports a number of third-party Python parsers.
One is the lxml parser. Depending on your setup,
you might install lxml with one of these commands:
$ apt-get install python-lxml
$ easy_install lxml
$ pip install lxml
Another alternative is the pure-Python html5lib parser, which parses HTML the way a
web browser does. Depending on your setup, you might install html5lib
with one of these commands:
$ apt-get install python-html5lib
$ easy_install html5lib
$ pip install html5lib
This table summarizes the advantages and disadvantages of each parser library:
Python's html.parser
  Typical usage: BeautifulSoup(markup, "html.parser")
  Advantages: batteries included, decent speed, lenient (as of Python 2.7.3 and 3.2)
  Disadvantages: not as fast as lxml, less lenient than html5lib

lxml's HTML parser
  Typical usage: BeautifulSoup(markup, "lxml")
  Advantages: very fast, lenient
  Disadvantages: external C dependency

lxml's XML parser
  Typical usage: BeautifulSoup(markup, "lxml-xml") or BeautifulSoup(markup, "xml")
  Advantages: very fast, the only currently supported XML parser
  Disadvantages: external C dependency

html5lib
  Typical usage: BeautifulSoup(markup, "html5lib")
  Advantages: extremely lenient, parses pages the same way a web browser does, creates valid HTML5
  Disadvantages: very slow, external Python dependency
If you can, I recommend you install and use lxml for speed. If you’re
using a very old version of Python – earlier than 2.7.3 or 3.2.2 –
it’s essential that you install lxml or html5lib. Python’s built-in
HTML parser is just not very good in those old versions.
Note that if a document is invalid, different parsers will generate
different Beautiful Soup trees for it. See Differences
between parsers for details.
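A minimal sketch of that effect, assuming lxml and html5lib are installed alongside the built-in parser (outputs are approximate):

from bs4 import BeautifulSoup

print(BeautifulSoup("<a></p>", "lxml"))
# roughly: <html><body><a></a></body></html>

print(BeautifulSoup("<a></p>", "html5lib"))
# roughly: <html><head></head><body><a><p></p></a></body></html>

print(BeautifulSoup("<a></p>", "html.parser"))
# roughly: <a></a>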
To parse a document, pass it into the BeautifulSoup
constructor. You can pass in a string or an open filehandle:
with open("index.html") as fp:
    soup = BeautifulSoup(fp, 'html.parser')

soup = BeautifulSoup("<html>a web page</html>", 'html.parser')
First, the document is converted to Unicode, and HTML entities are
converted to Unicode characters:
print(BeautifulSoup("Sacr&eacute; bleu!", "html.parser"))
# Sacré bleu!
Beautiful Soup then parses the document using the best available
parser. It will use an HTML parser unless you specifically tell it to
use an XML parser. (See Parsing XML. )
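A minimal sketch of asking for the XML parser explicitly (this requires lxml to be installed):

from bs4 import BeautifulSoup

xml_doc = "<catalog><book id='1'>Soup</book></catalog>"
soup = BeautifulSoup(xml_doc, "xml")
print(soup.book["id"])   # '1'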
Beautiful Soup transforms a complex HTML document into a complex tree
of Python objects. But you’ll only ever have to deal with about four
kinds of objects: Tag, NavigableString, BeautifulSoup,
and Comment.
Tag¶
A Tag object corresponds to an XML or HTML tag in the original document:
soup = BeautifulSoup('<b class="boldest">Extremely bold</b>', 'html.parser')
tag = soup.b
type(tag)
# <class 'bs4.element.Tag'>
Tags have a lot of attributes and methods, and I’ll cover most of them
in Navigating the tree and Searching the tree. For now, the most
important features of a tag are its name and attributes.
Name¶
Every tag has a name, accessible as .name:

tag.name
# 'b'

If you change a tag's name, the change will be reflected in any HTML
markup generated by Beautiful Soup:

tag.name = "blockquote"
tag
# <blockquote class="boldest">Extremely bold</blockquote>
Attributes¶
A tag may have any number of attributes. The tag <b id="boldest"> has an attribute "id" whose value is
"boldest". You can access a tag's attributes by treating the tag like
a dictionary:

tag = BeautifulSoup('<b id="boldest">bold</b>', 'html.parser').b
tag['id']
# 'boldest'

You can access that dictionary directly as .attrs:

tag.attrs
# {'id': 'boldest'}
You can add, remove, and modify a tag’s attributes. Again, this is
done by treating the tag as a dictionary:
tag['id'] = 'verybold'
tag['another-attribute'] = 1
tag
# <b another-attribute="1" id="verybold">bold</b>

del tag['id']
del tag['another-attribute']
tag
# <b>bold</b>

tag['id']
# KeyError: 'id'
tag.get('id')
# None
Multi-valued attributes¶
HTML 4 defines a few attributes that can have multiple values. HTML 5
removes a couple of them, but defines a few more. The most common
multi-valued attribute is class (that is, a tag can have more than
one CSS class). Others include rel, rev, accept-charset,
headers, and accesskey. Beautiful Soup presents the value(s)
of a multi-valued attribute as a list:
css_soup = BeautifulSoup('<p class="body"></p>', 'html.parser')
css_soup.p['class']
# ['body']

css_soup = BeautifulSoup('<p class="body strikeout"></p>', 'html.parser')
css_soup.p['class']
# ['body', 'strikeout']
If an attribute looks like it has more than one value, but it’s not
a multi-valued attribute as defined by any version of the HTML
standard, Beautiful Soup will leave the attribute alone:
id_soup = BeautifulSoup('<p id="my id"></p>', 'html.parser')
id_soup.p['id']
# 'my id'
When you turn a tag back into a string, multiple attribute values are
consolidated:
rel_soup = BeautifulSoup('<p>Back to the <a rel="index">homepage</a></p>', 'html.parser')
rel_soup.a['rel']
# ['index']
rel_soup.a['rel'] = ['index', 'contents']
print(rel_soup.p)
# <p>Back to the <a rel="index contents">homepage</a></p>
You can disable this by passing multi_valued_attributes=None as a
keyword argument into the BeautifulSoup constructor:
no_list_soup = BeautifulSoup('<p class="body strikeout"></p>', 'html.parser', multi_valued_attributes=None)
no_list_soup.p['class']
# 'body strikeout'
You can use get_attribute_list to get a value that's always a
list, whether or not it's a multi-valued attribute:

id_soup.p.get_attribute_list('id')
# ["my id"]
If you parse a document as XML, there are no multi-valued attributes:
xml_soup = BeautifulSoup('<p class="body strikeout"></p>', 'xml')
xml_soup.p['class']
# 'body strikeout'
Again, you can configure this using the multi_valued_attributes argument:
class_is_multi = { '*' : 'class'}
xml_soup = BeautifulSoup('<p class="body strikeout"></p>', 'xml', multi_valued_attributes=class_is_multi)
xml_soup.p['class']
# ['body', 'strikeout']
You probably won’t need to do this, but if you do, use the defaults as
a guide. They implement the rules described in the HTML specification:
from bs4.builder import builder_registry
builder_registry.lookup('html').DEFAULT_CDATA_LIST_ATTRIBUTES
NavigableString¶
A string corresponds to a bit of text within a tag. Beautiful Soup
uses the NavigableString class to contain these bits of text:
soup = BeautifulSoup('<b class="boldest">Extremely bold</b>', 'html.parser')
tag = soup.b
tag.string
# 'Extremely bold'
type(tag.string)
# <class 'bs4.element.NavigableString'>
A NavigableString is just like a Python Unicode string, except
that it also supports some of the features described in Navigating
the tree and Searching the tree. You can convert a
NavigableString to a Unicode string with unicode() (in
Python 2) or str (in Python 3):
unicode_string = str(tag.string)
unicode_string
# 'Extremely bold'
type(unicode_string)
# <class 'str'>
You can’t edit a string in place, but you can replace one string with
another, using replace_with():
tag.string.replace_with("No longer bold")
tag
# <b class="boldest">No longer bold</b>
NavigableString supports most of the features described in
Navigating the tree and Searching the tree, but not all of
them. In particular, since a string can’t contain anything (the way a
tag may contain a string or another tag), strings don’t support the. contents or attributes, or the find() method.
If you want to use a NavigableString outside of Beautiful Soup,
you should call unicode() on it to turn it into a normal Python
Unicode string. If you don’t, your string will carry around a
reference to the entire Beautiful Soup parse tree, even when you’re
done using Beautiful Soup. This is a big waste of memory.
BeautifulSoup¶
The BeautifulSoup object represents the parsed document as a
whole. For most purposes, you can treat it as a Tag
object. This means it supports most of the methods described in
Navigating the tree and Searching the tree.
You can also pass a BeautifulSoup object into one of the methods
defined in Modifying the tree, just as you would a Tag. This
lets you do things like combine two parsed documents:
doc = BeautifulSoup("<document><content/>INSERT FOOTER HERE</document>", "xml")
footer = BeautifulSoup("<footer>Here's the footer</footer>", "xml")
doc.find(text="INSERT FOOTER HERE").replace_with(footer)
# 'INSERT FOOTER HERE'
print(doc)
# <?xml version="1.0" encoding="utf-8"?>
# <document><content/><footer>Here's the footer</footer></document>
Since the BeautifulSoup object doesn’t correspond to an actual
HTML or XML tag, it has no name and no attributes. But sometimes it’s
useful to look at its .name, so it's been given the special .name
"[document]":

soup.name
# '[document]'
Here’s the “Three sisters” HTML document again:
html_doc = """<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>

<p class="story">Once upon a time there were three little sisters; and their names were
<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>,
<a href="http://example.com/lacie" class="sister" id="link2">Lacie</a> and
<a href="http://example.com/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>

<p class="story">...</p>
"""

soup = BeautifulSoup(html_doc, 'html.parser')
I’ll use this as an example to show you how to move from one part of
a document to another.
Going down¶
Tags may contain strings and other tags. These elements are the tag’s
children. Beautiful Soup provides a lot of different attributes for
navigating and iterating over a tag’s children.
Note that Beautiful Soup strings don’t support any of these
attributes, because a string can’t have children.
Navigating using tag names¶
The simplest way to navigate the parse tree is to say the name of the
tag you want. If you want the <head> tag, just say soup.head:

soup.head
# <head><title>The Dormouse's story</title></head>

soup.title
# <title>The Dormouse's story</title>
You can use this trick again and again to zoom in on a certain part
of the parse tree. This code gets the first <b> tag beneath the <body> tag:

soup.body.b
# <b>The Dormouse's story</b>
Using a tag name as an attribute will give you only the first tag by that
name:

soup.a
# <a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>
If you need to get all the tags, or anything more complicated
than the first tag with a certain name, you’ll need to use one of the
methods described in Searching the tree, such as find_all():
soup.find_all('a')
# [<a class="sister" href="http://example.com/elsie" id="link1">Elsie</a>,
#  <a class="sister" href="http://example.com/lacie" id="link2">Lacie</a>,
#  <a class="sister" href="http://example.com/tillie" id="link3">Tillie</a>]

.contents and .children¶
A tag's children are available in a list called .contents:

head_tag = soup.head
head_tag
# <head><title>The Dormouse's story</title></head>

head_tag.contents
# [<title>The Dormouse's story</title>]

title_tag = head_tag.contents[0]
title_tag
# <title>The Dormouse's story</title>
title_tag.contents
# ['The Dormouse's story']
The BeautifulSoup object itself has children. In this case, the
<html> tag is the child of the BeautifulSoup object:

len(soup.contents)
# 1
soup.contents[0].name
# 'html'
A string does not have. contents, because it can’t contain
anything:
text = title_tag.contents[0]
text.contents
# AttributeError: 'NavigableString' object has no attribute 'contents'
Instead of getting them as a list, you can iterate over a tag’s
children using the .children generator:

for child in title_tag.children:
    print(child)
# The Dormouse's story

.descendants¶
The .contents and .children attributes only consider a tag's
direct children. For instance, the tag has a single direct
child–the tag:<br /> But the <title> tag itself has a child: the string “The Dormouse’s<br /> story”. There’s a sense in which that string is also a child of the<br /> <head> tag. The. descendants attribute lets you iterate over all<br /> of a tag’s children, recursively: its direct children, the children of<br /> its direct children, and so on:<br /> for child in scendants:<br /> The <head> tag has only one child, but it has two descendants: the<br /> <title> tag and the <title> tag’s child. The BeautifulSoup object<br /> only has one direct child (the <html> tag), but it has a whole lot of<br /> descendants:<br /> len(list(ildren))<br /> len(list(scendants))<br /> # 26<br /> ¶<br /> If a tag has only one child, and that child is a NavigableString,<br /> the child is made available as<br /> # ‘The Dormouse’s story’<br /> If a tag’s only child is another tag, and that tag has a, then the parent tag is considered to have the same<br /> as its child:<br /> If a tag contains more than one thing, then it’s not clear what<br /> should refer to, so is defined to be<br /> None:<br /> print()<br /> # None. strings and stripped_strings¶<br /> If there’s more than one thing inside a tag, you can still look at<br /> just the strings. Use the. strings generator:<br /> for string in rings:<br /> print(repr(string))<br /> ‘\n’<br /> # “The Dormouse’s story”<br /> # ‘\n’<br /> # ‘Once upon a time there were three little sisters; and their names were\n’<br /> # ‘Elsie’<br /> # ‘, \n’<br /> # ‘Lacie’<br /> # ‘ and\n’<br /> # ‘Tillie’<br /> # ‘;\nand they lived at the bottom of a well. ‘<br /> # ‘… ‘<br /> These strings tend to have a lot of extra whitespace, which you can<br /> remove by using the. stripped_strings generator instead:<br /> for string in ripped_strings:<br /> # ‘Once upon a time there were three little sisters; and their names were’<br /> # ‘, ‘<br /> # ‘and’<br /> # ‘;\n and they lived at the bottom of a well. ‘<br /> Here, strings consisting entirely of whitespace are ignored, and<br /> whitespace at the beginning and end of strings is removed.<br /> Going up¶<br /> Continuing the “family tree” analogy, every tag and every string has a<br /> parent: the tag that contains it.<br /> You can access an element’s parent with the attribute. In<br /> the example “three sisters” document, the <head> tag is the parent<br /> of the <title> tag:<br /> title_tag =<br /> The title string itself has a parent: the <title> tag that contains<br /> it:<br /> The parent of a top-level tag like <html> is the BeautifulSoup object<br /> itself:<br /> html_tag =<br /> # <class 'autifulSoup'><br /> And the of a BeautifulSoup object is defined as None:<br /> # None. parents¶<br /> You can iterate over all of an element’s parents with. parents. This example uses. parents to travel from an <a> tag<br /> buried deep within the document, to the very top of the document:<br /> link = soup. a<br /> link<br /> for parent in rents:<br /> # p<br /> # body<br /> # html<br /> # [document]<br /> Going sideways¶<br /> Consider a simple document like this:<br /> sibling_soup = BeautifulSoup(“<a><b>text1</b><c>text2</c></b></a>“, ”)<br /> # <a><br /> # text1<br /> # <c><br /> # text2<br /> # </c><br /> The <b> tag and the <c> tag are at the same level: they’re both direct<br /> children of the same tag. We call them siblings. When a document is<br /> pretty-printed, siblings show up at the same indentation level. You<br /> can also use this relationship in the code you write.. next_sibling and. 
previous_sibling¶<br /> You can use. previous_sibling to navigate<br /> between page elements that are on the same level of the parse tree:<br /> xt_sibling<br /> # <c>text2</c><br /> evious_sibling<br /> # <b>text1</b><br /> The <b> tag has a. next_sibling, but no. previous_sibling,<br /> because there’s nothing before the <b> tag on the same level of the<br /> tree. For the same reason, the <c> tag has a. previous_sibling<br /> but no. next_sibling:<br /> print(evious_sibling)<br /> print(xt_sibling)<br /> The strings “text1” and “text2” are not siblings, because they don’t<br /> have the same parent:<br /> # ‘text1’<br /> In real documents, the. next_sibling or. previous_sibling of a<br /> tag will usually be a string containing whitespace. Going back to the<br /> “three sisters” document:<br /> # <a href=" class="sister" id="link1">Elsie</a><br /> # <a href=" class="sister" id="link2">Lacie</a><br /> # <a href=" class="sister" id="link3">Tillie</a><br /> You might think that the. next_sibling of the first <a> tag would<br /> be the second <a> tag. But actually, it’s a string: the comma and<br /> newline that separate the first <a> tag from the second:<br /> # ‘, \n ‘<br /> The second <a> tag is actually the. next_sibling of the comma:<br /> # <a class="sister" href=" id="link2">Lacie</a>. next_siblings and. previous_siblings¶<br /> You can iterate over a tag’s siblings with. next_siblings or. previous_siblings:<br /> for sibling in xt_siblings:<br /> print(repr(sibling))<br /> # <a class="sister" href=" id="link2">Lacie</a><br /> # ‘; and they lived at the bottom of a well. ‘<br /> for sibling in (id=”link3″). previous_siblings:<br /> Going back and forth¶<br /> Take a look at the beginning of the “three sisters” document:<br /> # <html><head><title>The Dormouse’s story
An HTML parser takes this string of characters and turns it into a
series of events: “open an tag”, “open a tag”, “open a
tag”, “add a string”, “close the <title> tag”, “open a </p> <p> tag”, and so on. Beautiful Soup offers tools for reconstructing the<br /> initial parse of the document.. next_element and. previous_element¶<br /> The. next_element attribute of a string or tag points to whatever<br /> was parsed immediately afterwards. It might be the same as. next_sibling, but it’s usually drastically different.<br /> Here’s the final <a> tag in the “three sisters” document. Its. next_sibling is a string: the conclusion of the sentence that was<br /> interrupted by the start of the <a> tag. :<br /> last_a_tag = (“a”, id=”link3″)<br /> last_a_tag<br /> But the. next_element of that <a> tag, the thing that was parsed<br /> immediately after the <a> tag, is not the rest of that sentence:<br /> it’s the word “Tillie”:<br /> xt_element<br /> That’s because in the original markup, the word “Tillie” appeared<br /> before that semicolon. The parser encountered an <a> tag, then the<br /> word “Tillie”, then the closing </a> tag, then the semicolon and rest of<br /> the sentence. The semicolon is on the same level as the <a> tag, but the<br /> word “Tillie” was encountered first.<br /> The. previous_element attribute is the exact opposite of. next_element. It points to whatever element was parsed<br /> immediately before this one:<br /> evious_element<br /> # <a class="sister" href=" id="link3">Tillie</a>. next_elements and. previous_elements¶<br /> You should get the idea by now. You can use these iterators to move<br /> forward or backward in the document as it was parsed:<br /> for element in xt_elements:<br /> print(repr(element))<br /> # </p> <p class="story">… </p> <p>Beautiful Soup defines a lot of methods for searching the parse tree,<br /> but they’re all very similar. I’m going to spend a lot of time explaining<br /> the two most popular methods: find() and find_all(). The other<br /> methods take almost exactly the same arguments, so I’ll just cover<br /> them briefly.<br /> Once again, I’ll be using the “three sisters” document as an example:<br /> By passing in a filter to an argument like find_all(), you can<br /> zoom in on the parts of the document you’re interested in.<br /> Kinds of filters¶<br /> Before talking in detail about find_all() and similar methods, I<br /> want to show examples of different filters you can pass into these<br /> methods. These filters show up again and again, throughout the<br /> search API. You can use them to filter based on a tag’s name,<br /> on its attributes, on the text of a string, or on some combination of<br /> these.<br /> A string¶<br /> The simplest filter is a string. Pass a string to a search method and<br /> Beautiful Soup will perform a match against that exact string. This<br /> code finds all the <b> tags in the document:<br /> nd_all(‘b’)<br /> # [<b>The Dormouse’s story</b>]<br /> If you pass in a byte string, Beautiful Soup will assume the string is<br /> encoded as UTF-8. You can avoid this by passing in a Unicode string instead.<br /> A regular expression¶<br /> If you pass in a regular expression object, Beautiful Soup will filter<br /> against that regular expression using its search() method. 
This code<br /> finds all the tags whose names start with the letter “b”; in this<br /> case, the <body> tag and the <b> tag:<br /> import re<br /> for tag in nd_all(mpile(“^b”)):<br /> # b<br /> This code finds all the tags whose names contain the letter ‘t’:<br /> for tag in nd_all(mpile(“t”)):<br /> # title<br /> A list¶<br /> If you pass in a list, Beautiful Soup will allow a string match<br /> against any item in that list. This code finds all the <a> tags<br /> and all the <b> tags:<br /> nd_all([“a”, “b”])<br /> # [<b>The Dormouse’s story</b>,<br /> # <a class="sister" href=" id="link1">Elsie</a>,<br /> True¶<br /> The value True matches everything it can. This code finds all<br /> the tags in the document, but none of the text strings:<br /> for tag in nd_all(True):<br /> # head<br /> # a<br /> A function¶<br /> If none of the other matches work for you, define a function that<br /> takes an element as its only argument. The function should return<br /> True if the argument matches, and False otherwise.<br /> Here’s a function that returns True if a tag defines the “class”<br /> attribute but doesn’t define the “id” attribute:<br /> def has_class_but_no_id(tag):<br /> return tag. has_attr(‘class’) and not tag. has_attr(‘id’)<br /> Pass this function into find_all() and you’ll pick up all the </p> <p> tags:<br /> nd_all(has_class_but_no_id)<br /> # [</p> <p class="title"><b>The Dormouse’s story</b></p> <p>,<br /> # </p> <p class="story">Once upon a time there were…bottom of a well. </p> <p>,<br /> # </p> <p class="story">… </p> <p>]<br /> This function only picks up the </p> <p> tags. It doesn’t pick up the <a><br /> tags, because those tags define both “class” and “id”. It doesn’t pick<br /> up tags like <html> and <title>, because those tags don’t define<br /> “class”.<br /> If you pass in a function to filter on a specific attribute like<br /> href, the argument passed into the function will be the attribute<br /> value, not the whole tag. Here’s a function that finds all a tags<br /> whose href attribute does not match a regular expression:<br /> def not_lacie(href):<br /> return href and not mpile(“lacie”)(href)<br /> nd_all(href=not_lacie)<br /> The function can be as complicated as you need it to be. Here’s a<br /> function that returns True if a tag is surrounded by string<br /> objects:<br /> from bs4 import NavigableString<br /> def surrounded_by_strings(tag):<br /> return (isinstance(xt_element, NavigableString)<br /> and isinstance(evious_element, NavigableString))<br /> for tag in nd_all(surrounded_by_strings):<br /> Now we’re ready to look at the search methods in detail.<br /> find_all()¶<br /> Method signature: find_all(name, attrs, recursive, string, limit, **kwargs)<br /> The find_all() method looks through a tag’s descendants and<br /> retrieves all descendants that match your filters. I gave several<br /> examples in Kinds of filters, but here are a few more:<br /> nd_all(“title”)<br /> nd_all(“p”, “title”)<br /> # [</p> <p class="title"><b>The Dormouse’s story</b></p> <p>]<br /> nd_all(“a”)<br /> nd_all(id=”link2″)<br /> # [<a class="sister" href=" id="link2">Lacie</a>]<br /> (mpile(“sisters”))<br /> Some of these should look familiar, but others are new. What does it<br /> mean to pass in a value for string, or id? 
Why does<br /> find_all(“p”, “title”) find a </p> <p> tag with the CSS class “title”?<br /> Let’s look at the arguments to find_all().<br /> The name argument¶<br /> Pass in a value for name and you’ll tell Beautiful Soup to only<br /> consider tags with certain names. Text strings will be ignored, as<br /> will tags whose names that don’t match.<br /> This is the simplest usage:<br /> Recall from Kinds of filters that the value to name can be a<br /> string, a regular expression, a list, a function, or the value<br /> True.<br /> The keyword arguments¶<br /> Any argument that’s not recognized will be turned into a filter on one<br /> of a tag’s attributes. If you pass in a value for an argument called id,<br /> Beautiful Soup will filter against each tag’s ‘id’ attribute:<br /> nd_all(id=’link2′)<br /> If you pass in a value for href, Beautiful Soup will filter<br /> against each tag’s ‘href’ attribute:<br /> nd_all(mpile(“elsie”))<br /> # [<a class="sister" href=" id="link1">Elsie</a>]<br /> You can filter an attribute based on a string, a regular<br /> expression, a list, a function, or the value True.<br /> This code finds all tags whose id attribute has a value,<br /> regardless of what the value is:<br /> nd_all(id=True)<br /> You can filter multiple attributes at once by passing in more than one<br /> keyword argument:<br /> nd_all(mpile(“elsie”), id=’link1′)<br /> Some attributes, like the data-* attributes in HTML 5, have names that<br /> can’t be used as the names of keyword arguments:<br /> data_soup = BeautifulSoup(‘</p> <div data-foo="value">foo! </div> <p>‘, ”)<br /> nd_all(data-foo=”value”)<br /> # SyntaxError: keyword can’t be an expression<br /> You can use these attributes in searches by putting them into a<br /> dictionary and passing the dictionary into find_all() as the<br /> attrs argument:<br /> nd_all(attrs={“data-foo”: “value”})<br /> # [</p> <div data-foo="value">foo! </div> <p>]<br /> You can’t use a keyword argument to search for HTML’s ‘name’ element,<br /> because Beautiful Soup uses the name argument to contain the name<br /> of the tag itself. Instead, you can give a value to ‘name’ in the<br /> name_soup = BeautifulSoup(‘<input name="email"/>‘, ”)<br /> nd_all(name=”email”)<br /> # []<br /> nd_all(attrs={“name”: “email”})<br /> # [<input name="email"/>]<br /> Searching by CSS class¶<br /> It’s very useful to search for a tag that has a certain CSS class, but<br /> the name of the CSS attribute, “class”, is a reserved word in<br /> Python. Using class as a keyword argument will give you a syntax<br /> error. As of Beautiful Soup 4. 1. 2, you can search by CSS class using<br /> the keyword argument class_:<br /> nd_all(“a”, class_=”sister”)<br /> As with any keyword argument, you can pass class_ a string, a regular<br /> expression, a function, or True:<br /> nd_all(mpile(“itl”))<br /> def has_six_characters(css_class):<br /> return css_class is not None and len(css_class) == 6<br /> nd_all(class_=has_six_characters)<br /> Remember that a single tag can have multiple<br /> values for its “class” attribute. 
When you search for a tag that<br /> matches a certain CSS class, you’re matching against any of its CSS<br /> classes:<br /> nd_all(“p”, class_=”strikeout”)<br /> # [</p> <p class="body strikeout"> <p>]<br /> nd_all(“p”, class_=”body”)<br /> You can also search for the exact string value of the class attribute:<br /> nd_all(“p”, class_=”body strikeout”)<br /> But searching for variants of the string value won’t work:<br /> nd_all(“p”, class_=”strikeout body”)<br /> If you want to search for tags that match two or more CSS classes, you<br /> should use a CSS selector:<br /> (“p. “)<br /> In older versions of Beautiful Soup, which don’t have the class_<br /> shortcut, you can use the attrs trick mentioned above. Create a<br /> dictionary whose value for “class” is the string (or regular<br /> expression, or whatever) you want to search for:<br /> nd_all(“a”, attrs={“class”: “sister”})<br /> The string argument¶<br /> With string you can search for strings instead of tags. As with<br /> name and the keyword arguments, you can pass in a string, a<br /> regular expression, a list, a function, or the value True.<br /> Here are some examples:<br /> nd_all(string=”Elsie”)<br /> # [‘Elsie’]<br /> nd_all(string=[“Tillie”, “Elsie”, “Lacie”])<br /> # [‘Elsie’, ‘Lacie’, ‘Tillie’]<br /> nd_all(mpile(“Dormouse”))<br /> # [“The Dormouse’s story”, “The Dormouse’s story”]<br /> def is_the_only_string_within_a_tag(s):<br /> “””Return True if this string is the only child of its parent tag. “””<br /> return (s ==)<br /> nd_all(string=is_the_only_string_within_a_tag)<br /> # [“The Dormouse’s story”, “The Dormouse’s story”, ‘Elsie’, ‘Lacie’, ‘Tillie’, ‘… ‘]<br /> Although string is for finding strings, you can combine it with<br /> arguments that find tags: Beautiful Soup will find all tags whose<br /> matches your value for string. This code finds the <a><br /> tags whose is “Elsie”:<br /> nd_all(“a”, string=”Elsie”)<br /> # [<a href=" class="sister" id="link1">Elsie</a>]<br /> The string argument is new in Beautiful Soup 4. 4. 0. In earlier<br /> versions it was called text:<br /> nd_all(“a”, text=”Elsie”)<br /> The limit argument¶<br /> find_all() returns all the tags and strings that match your<br /> filters. This can take a while if the document is large. If you don’t<br /> need all the results, you can pass in a number for limit. This<br /> works just like the LIMIT keyword in SQL. It tells Beautiful Soup to<br /> stop gathering results after it’s found a certain number.<br /> There are three links in the “three sisters” document, but this code<br /> only finds the first two:<br /> nd_all(“a”, limit=2)<br /> # <a class="sister" href=" id="link2">Lacie</a>]<br /> The recursive argument¶<br /> If you call nd_all(), Beautiful Soup will examine all the<br /> descendants of mytag: its children, its children’s children, and<br /> so on. If you only want Beautiful Soup to consider direct children,<br /> you can pass in recursive=False. See the differe<br /> <img decoding="async" src="https://bilderupload.net/wp-content/uploads/2021/11/puzzles-scenes-export-to-gltf.jpg" alt="Using BeautifulSoup to parse HTML and extract press ..." title="Using BeautifulSoup to parse HTML and extract press ..." /></p> <h2>Using BeautifulSoup to parse HTML and extract press …</h2> <p>A webpage is just a text file in HTML format. And HTML-formatted text is ultimately just text. 
So, let’s write our own HTML from scratch, without worrying yet about “the Web”:<br /> htmltxt = “</p> <p>Hello World</p> <p>”<br /> The point of HTML-parsing is to be able to efficiently extract the text values in an HTML document – e. g. Hello World – apart from the HTML markup – e. </p> <p>.<br /> We’ll start out by using Beautiful Soup, one of Python’s most popular HTML-parsing libraries.<br /> Importing the BeautifulSoup constructor function<br /> This is the standard import statement for using Beautiful Soup:<br /> from bs4 import BeautifulSoup<br /> The BeautifulSoup constructor function takes in two string arguments:<br /> The HTML string to be parsed.<br /> Optionally, the name of a parser. Without getting into the background of why there are multiple implementations of HTML parsing, for our purposes, we will always be using ‘lxml’.<br /> So, let’s parse some HTML:<br /> soup = BeautifulSoup(htmltxt, ‘lxml’)<br /> The “soup” object<br /> What is soup? As always, use the type() method to inspect an unknown object:<br /> type(soup)<br /> # autifulSoup<br /> OK, at least we know that soup is not just plain text. The more complicated answer is that soup is now an object with much more complexity and methods than just a Python string. However, this complexity is worth diving into, because the BeautifulSoup-type object has specific methods designed for efficiently working with HTML.<br /> The BeautifulSoup object has a text attribute that returns the plain text of a HTML string sans the tags. Given our simple soup of </p> <p>Hello World</p> <p>, the text attribute returns:<br /> # ‘Hello World’<br /> Let’s try a more complicated HTML string:<br /> soup = BeautifulSoup(“””</p> <h1>Hello</h1> <p>World</p> <p>“””, ‘lxml’)<br /> # ‘HelloWorld’<br /> And here’s a HTML string that contains a URL:<br /> mytxt = “””</p> <h1>Hello World</h1> <p>This is a <a href=">link</a></p> <p>“””<br /> soup = BeautifulSoup(mytxt, ‘lxml’)<br /> # ‘Hello World\nThis is a link’<br /> Basically, the BeautifulSoup’s text attribute will return a string stripped of any HTML tags and metadata.<br /> Generally, we don’t want to just spit all of the tag-stripped text of an HTML document. Usually, we want to extract text from just a few specific elements.<br /> Let’s re-use our “complicated” HTML string from above:</p> <p>This is a <a href=">link</a></p> <p>“””<br /> It contains 3 HTML tags:<br /> A headline, </p> <h1> A paragraph, </p> <p> Within that paragraph, a hyperlink, <a><br /> To find the first element by tag, we use the BeautifulSoup object’s find() method, which takes a tag’s name as the first argument:<br /> (‘a’)<br /> # <a href=">link</a><br /> Again, use type() to figure out what exactly is being returned:<br /> type((‘a’))<br /> #<br /> What’s the difference between a Tag and BeautifulSoup object? I don’t really know, but what’s important to us is their similarities. A Tag object also has a text attribute:<br /> # link<br /> Try find() with the other tags:<br /> (‘p’)<br /> # </p> <p>This is a <a href=">link</a></p> <p># ‘This is a link’<br /> For the White House press briefings – and other HTML-parsing exercises – we want more than just the rendered text of the HTML. We’ll want some of the meta attributes of the HTML, such as the href values for link tags.<br /> The Tag object has the attrs attribute, which returns a dictionary of key-value pairs. 
Let’s start from the top:<br /> mylink = (‘a’)<br /> To extract the value of the href attribute from the mylink object, use attrs:<br /> type()<br /> # dict<br /> # {‘href’: ”}<br /> [‘href’]<br /> # ”<br /> What about the other tags in our HTML snippet? They have no attributes and thus will have blank dictionaries for their attrs attributes:<br /> (‘h1’)<br /> # {}<br /> OK, let’s step up the complexity; what if there are multiple <a> tags from which we want to extract href and text values? We use the find_all() method which returns a collection of elements:<br /> moretxt = “””</p> <p>Visit the <a href=''>New York Times</a></p> <p>Visit the <a href=''>Wall Street Journal</a></p> <p>soup = BeautifulSoup(moretxt, ‘lxml’)<br /> tags = nd_all(‘a’)<br /> type(tags)<br /> # sultSet<br /> A ResultSet acts very much like other kinds of Python sequence, such as a list:<br /> len(tags)<br /> # 2<br /> tags[0]<br /> # New York Times<br /> tags[0][‘href’]<br /> ”<br /> for t in tags:<br /> print(, [‘href’])<br /> # New York Times # Wall Street Journal<br /> However, be careful not to treat the ResultSet as if it were a Tag – try to understand why the following doesn’t make much sense (nevermind results in an error):<br /> # AttributeError: ‘ResultSet’ object has no attribute ‘attrs’<br /> The HTML attributes exist at a per-tag level – what would you expect it to return for a collection of tags? The designer of BeautifulSoup has no idea, thus, the error message.<br /> If what you want is the href value for each of the tags, then you have to do it the old fashioned way with a for-loop:<br /> hrefs = []<br /> (t)<br /> What happens when there is more than one “group” of link tags that we want? In the snippet below, the <a> tags we care about are nested within </p> <h1> tags:<br /> evenmoretxt = “””</p> <h1><a href=">Awesome</a></h1> <h1><a href=">Really Awesome</a></h1> <div><a href=">Ignore me</a></div> <div><a href=">Ignore me again</a></div> <p>soup = BeautifulSoup(evenmoretxt, ‘lxml’)<br /> First, we can collect all of the </p> <h1> tags using find_all():<br /> heds = nd_all(‘h1’)<br /> Each of the members of heds is a Tag object, and each Tag object has a find() method, which we can use to select just the nested <a> tag:<br /> links = []<br /> for h in heds:<br /> a = (‘a’)<br /> (a)<br /> Or, more concisely:<br /> ((‘a’))<br /> Parsing our own hand-constructed HTML is not much fun. So let’s get a “real” HTML document from the web.<br /> This part should be familiar:<br /> import requests<br /> resp = (”)<br /> txt =<br /> Whether the contents of txt is a hand-constructed string or something that came from the Web doesn’t matter when we’re working with Beautiful Soup – we only care about converting a string into a BeautifulSoup object:<br /> soup = BeautifulSoup(txt, ‘lxml’)<br /> Look at the webpage at. Inspect its source. Then see if you can write the Python code that extracts:<br /> The number of </p> <p> tags.<br /> The text in the first </p> <p> tag<br /> The length of the text of the first </p> <h1> tag<br /> The text of the first (and only) <a> tag<br /> The href of the first <a> tag<br /> My answers below:<br /> # 1. The number of `</p> <p>` tags.<br /> len(nd_all(‘p’))<br /> # 2. The text in the first `</p> <p>` tag<br /> nd_all(‘p’)[0]<br /> # 3. The length of the text of the first `</p> <h1>` tag<br /> len((‘h1’))<br /> # 4. The text of the first `<a>` tag<br /> # 5. 
The href of the first `<a>` tag<br /> (‘a’)[‘href’]<br /> Now see if you can extract each press briefing URL from this sample White House press briefings page:<br /> Examining the source HTML behind each press release tag<br /> Let’s look at that first URL.<br /> Its text is:<br /> Press Briefing by Press Secretary Jay Carney, 12/6/2013<br /> Its href value is:<br /> If you inspect the source and search for the specific tag, you’ll find this HTML:</p> <div class="views-field views-field-title"> <h3 class="field-content"><a href=">Press Briefing by Press Secretary Jay Carney, 12/6/2013</a></h3> </p></div> </p></div> <p>For this page, a link is more than just an <a> tag; it’s nested within several other tags. Here’s a pretty-formatted version of that one link and its parent tags:</p> <div class="views-field views-field-title"> <h3 class="field-content"> <a href="><br /> </a><br /> </h3> </div> <p>Processing the press briefings page as soup<br /> Let’s turn this convoluted HTML into soup. See if you can remember the steps for downloading the webpage and converting it to a soup object well enough to type them by memory:<br /> url = ”<br /> resp = (url)<br /> soup = BeautifulSoup(, ‘lxml’)<br /> There are 10 press briefings per page, but it should be evident that there are more than 10 link tags. That’s easy enough to find out:<br /> len(nd_all(‘a’))<br /> # 263<br /> So how do we get just the URLs for the actual press briefings? From the HTML that we inspected previously, we want <a> tags that are nested within </p> <h3> tags.<br /> So let’s find and count the number of h3 tags:<br /> len(nd_all(‘h3’))<br /> # 10<br /> Hey, what a coincidence – there are exactly as many h3 tags as links to press briefings. This is just a lucky result of how the White House webdevs decided to build this page.<br /> Here’s one way to extract all the URLs of the nested link tags into a list:<br /> urls = []<br /> for h in nd_all(‘h3’):<br /> ([‘href’])<br /> Here’s a more concise – albeit harder to read – version:<br /> ((‘a’)[‘href’])<br /> Either way, this is the contents of urls:<br /> [”,<br /> ”,<br /> ”]<br /> To extract the URLs from the canned sample webpage, here’s all the code:<br /> Now all we have to do is repeat this for every page of press briefings listings…</p> <h2>Frequently Asked Questions about parsing html with beautifulsoup</h2> <h3>What does BeautifulSoup return?</h3> <p>Basically, the BeautifulSoup ‘s text attribute will return a string stripped of any HTML tags and metadata.</p> <h3>How do I parse a website using BeautifulSoup?</h3> <p>First, we need to import all the libraries that we are going to use. Next, declare a variable for the url of the page. Then, make use of the Python urllib2 to get the HTML page of the url declared. Finally, parse the page into BeautifulSoup format so we can use BeautifulSoup to work on it.Jun 10, 2017</p> <h3>How do I open a HTML file in BeautifulSoup?</h3> <p>Python: Parse an Html File Using Beautifulsoupfrom bs4 import BeautifulSoup with open(‘files/file1.html’) as f: #read File content = f. read() #parse HTML soup = BeautifulSoup(content, ‘html.parser’) #print Title tag print(soup. … f = open(‘file.html’) content = f. 
… import glob files = glob.Apr 28, 2021</p> </div><!-- .entry-content --> <footer class="entry-footer"> <span class="cat-links"><a href="https://bilderupload.net/category/proxy/" rel="category tag">Proxy</a></span><span class="tags-links">Tags : <a href="https://bilderupload.net/tag/beautifulsoup-documentation/" rel="tag">beautifulsoup documentation</a>, <a href="https://bilderupload.net/tag/beautifulsoup-example/" rel="tag">beautifulsoup example</a>, <a href="https://bilderupload.net/tag/beautifulsoup-find-by-class/" rel="tag">beautifulsoup find by class</a>, <a href="https://bilderupload.net/tag/beautifulsoup-find_all/" rel="tag">beautifulsoup find_all</a>, <a href="https://bilderupload.net/tag/beautifulsoup-get-text-inside-tag/" rel="tag">beautifulsoup get text inside tag</a>, <a href="https://bilderupload.net/tag/beautifulsoup-html-parser-vs-lxml/" rel="tag">beautifulsoup html.parser vs lxml</a>, <a href="https://bilderupload.net/tag/html-parser-python/" rel="tag">html parser python</a>, <a href="https://bilderupload.net/tag/install-beautifulsoup/" rel="tag">install beautifulsoup</a></span><span class="comments-link"><a href="https://bilderupload.net/parsing-html-with-beautifulsoup/#respond">Leave a Comment<span class="screen-reader-text"> on Parsing Html With Beautifulsoup</span></a></span> </footer><!-- .entry-footer --> </div><!-- .blog-post-item --> </article><!-- #post-14021 --> <nav class="navigation post-navigation" aria-label="Posts"> <h2 class="screen-reader-text">Post navigation</h2> <div class="nav-links"><div class="nav-previous"><a href="https://bilderupload.net/piratebay-proxy-2021/" rel="prev"><span class="screen-reader-text">Previous Post</span><span aria-hidden="true" class="nav-subtitle">Previous</span> <span class="nav-title"><span class="nav-title-icon-wrapper"><svg class="icon icon-arrow-left" aria-hidden="true" role="img"> <use href="#icon-arrow-left" xlink:href="#icon-arrow-left"></use> </svg></span>Piratebay Proxy 2021</span></a></div><div class="nav-next"><a href="https://bilderupload.net/proxifier-%d1%81%d0%ba%d0%b0%d1%87%d0%b0%d1%82%d1%8c-%d0%b1%d0%b5%d1%81%d0%bf%d0%bb%d0%b0%d1%82%d0%bd%d0%be-%d0%bd%d0%b0-%d1%80%d1%83%d1%81%d1%81%d0%ba%d0%be%d0%bc-%d1%81-%d0%ba%d1%80%d1%8f%d0%ba/" rel="next"><span class="screen-reader-text">Next Post</span><span aria-hidden="true" class="nav-subtitle">Next</span> <span class="nav-title">Proxifier Скачать Бесплатно На Русском С Кряком<span class="nav-title-icon-wrapper"><svg class="icon icon-arrow-right" aria-hidden="true" role="img"> <use href="#icon-arrow-right" xlink:href="#icon-arrow-right"></use> </svg></span></span></a></div></div> </nav> <div id="comments" class="comments-area"> <div id="respond" class="comment-respond"> <h3 id="reply-title" class="comment-reply-title">Leave a Reply <small><a rel="nofollow" id="cancel-comment-reply-link" href="/parsing-html-with-beautifulsoup/#respond" style="display:none;">Cancel reply</a></small></h3><form action="https://bilderupload.net/wp-comments-post.php" method="post" id="commentform" class="comment-form" novalidate><p class="comment-notes"><span id="email-notes">Your email address will not be published.</span> <span class="required-field-message">Required fields are marked <span class="required">*</span></span></p><p class="comment-form-comment"><label for="comment">Comment <span class="required">*</span></label> <textarea id="comment" name="comment" cols="45" rows="8" maxlength="65525" required></textarea></p><p class="comment-form-author"><label for="author">Name <span 