
Save the planet & reduce your ecological footprint with your website...


Gautier Dorval
Owner
PAGUP, SEO AGENCY
620, boulevard St-Joseph O, #1
8193146831


DID YOU KNOW that web activity, in terms of CO2 production (millions of servers, cooling systems, etc.), is equivalent to the CO2 production of the worldwide aviation industry? And GOOGLE is responsible for almost half of it (45%)... WHAT if you could optimize your website, reduce your site's ecological footprint and, at your level, reduce the greenhouse gas (CO2) production inherent to its existence on the Web? This optimization mission is the one that PAGUP, a Canadian SEO agency specialized in search engine optimization systems, has adopted by creating a plugin specifically dedicated to WordPress. 10k downloads and 6 months later, this plugin has become a standard for thousands of companies...

PressReleasePing - January 13, 2019 - An original idea, crazy some would say, but one that has gradually, for more than 6 months, been making its way: how could a simple website help fight global warming?

Better Robots.txt plugin for WordPress

To understand it, you have to immerse yourself in the world of the Web, search engines and how they work. Every day, new websites appear on the Internet, enriching, whatever their content, the information available online. Thanks to search engines, whose mission is to crawl and index content in order to make it accessible online via simple queries, anyone, no matter where in the world they are, can access the most varied content on their computer or cell phone. This is the beauty of search engines, of which Google is the most effective representative.

However, the process of crawling a site, which, it should be noted, is carried out almost continuously in order to make published information (including the most recent) accessible to anyone, is particularly energy-consuming! It involves crawlers (spiders) visiting your website, link by link, to detect new pages, content and changes, in order to represent them as accurately as possible in search engine result pages (SERPs). As such, Google, which has become in a few years the most widely used search engine in the world, excels in its ability to provide the most appropriate and relevant information for the queries users make. It is a clever mix of technology, algorithms, servers, power, etc. that allows the last article published on your site to be read, organized, referenced and made available, within a few days and in a few clicks, to the first visitor interested in your subject.

And this work is titanic. To give you an idea, Google conducts more than 3.5 billion searches per day on behalf of its users, making it the main culprit, up to 40%, in the carbon footprint of the Web in general. In 2015, a study established that web activity, in terms of CO2 production (the use of millions of servers, cooling systems, etc.), was equivalent to the CO2 production of the worldwide aviation industry.



In fact, for your information, in the few seconds it took you to read these first lines, Google will have already emitted more than 40 tons of CO2: http://www.janavirgin.com/CO2/

That is 500 kg of CO2 produced per second...

Staggering, right?

Even though Google is aware of its carbon footprint, and its founders implemented less energy-intensive processes early on in the development of their data centers, including investments in clean energy and numerous carbon-offset programs, Google's infrastructure still emits a significant amount of CO2, and unfortunately, one that grows every year.

But what can you do about it? After all, you and your website cannot be held responsible for this. You are doing your part and, a priori, you have not asked Google for anything, even if in reality you depend on it considerably (for traffic to your site).

In fact, what you could do, and what could have a global impact (if all users did it) on the CO2 emitted by Google to read the Web, organize the information and give users access to it, would simply be to simplify the work that Google's indexing robots (crawlers) have to perform when they visit your website.

You may not know it, but your website is not limited to the pages you create with your content, nor to what is visible in search results. Your site contains an astronomical number of internal links, intended strictly for its operation: to generate interactions between pages, to filter results, to organize content, to allow access to certain restricted information (useful for your developer but not for your visitors), and so on. And when your site is made available for crawling by search engines (concretely, when your site is published online), crawlers systematically try to visit all the links it contains in order to identify the information present, index it and make it available. But, and this part is important, exploring/crawling your website (which, let's remember, is almost continuous) requires power, a lot of energy, both from the crawlers (search engine servers) and from your own hosting server.

That's why, very early on, in the continuous improvement of its crawling system, Google, for example, defined a limit to the number of links a robot can explore in a session. And the reason is simple: the power required by indexing robots to explore your site directly impacts the performance and efficiency of your website. In other words, during the crawling process, if your hosting is limited (which is very common), your website is slower to load its pages and content for visitors (the processor and RAM being directly impacted).

However, this limitation, called the "crawl budget", is not a general rule applied by all (known) search engines, and certainly not by the thousands of web robots ("scrapers") continuously visiting, copying and analyzing the Web in general. Nowadays, more than 50% of the world's web traffic is generated by robots, not humans. So we are not alone.
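
As an aside, and purely as an illustration: for crawlers that honor it, a Crawl-delay directive in the robots.txt asks them to pause between requests, which lightens the load on your hosting server. Google ignores this directive, but crawlers such as Bing and Yandex have historically honored it. A minimal sketch:

    # Illustrative snippet: ask compliant crawlers to wait
    # 10 seconds between requests (ignored by Google)
    User-agent: *
    Crawl-delay: 10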

Last but not least, it is very common, sometimes due to the colossal size some websites can reach (online stores, etc.), for crawlers to get "stuck" in certain parts of a site. In other words, indexing robots crawl a huge number of non-essential links (sometimes an infinite number, if there are loops), specific for example to online calendar features (where every day, month and year is an explorable page), from which they can no longer "leave", due in particular to the limit on the number of links they can explore. The direct consequence is to degrade the overall exploration of your site and, ultimately, the accessibility of your important pages (articles, services, products, etc.). This may explain, for example, why even after several weeks, some recently published content is still not visible in search results (without precise instructions, crawlers explore all the links they find, even those of no interest whatsoever to you).
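
To make this concrete, here is an illustrative rule (the URL pattern is hypothetical, assuming the calendar lives under /calendar/) that keeps crawlers out of such an endless calendar section while leaving the rest of the site crawlable:

    # Illustrative crawl-trap fix: block the endless calendar pages
    User-agent: *
    Disallow: /calendar/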

So, if you could tell crawlers precisely, whoever they are (Google, Bing, Yahoo, Baidu, etc.), what they may explore and what is not necessary for your visibility, you could both ensure better performance for your website and, ABOVE ALL, significantly reduce the energy (and therefore the power consumption) required by your hosting server, by Google, and by all the other entities exploring the Web.

Admittedly, on an individual basis, this "optimization" represents only a small amount compared to the giants of the Web. But if everyone, on a global scale, with the hundreds of millions of websites available, participated in the movement, it would have a real impact on that intangible thing that is the Web, and thus reduce electricity consumption and, ultimately, CO2 emissions.

So what can we do?<br />

This optimization mission is the one that PAGUP, a Canadian SEO agency specialized in search engine optimization systems, has adopted by creating a plugin specifically dedicated to WordPress (nowadays, nearly 28% of the Web is built on WordPress, 39% of all online shops run on WooCommerce (WordPress), and WordPress holds almost 59% of the CMS market share worldwide), allowing, FOR FREE, very simply and in a few clicks, the optimization of a file called the robots.txt.

The… what?<br />

The “robots.txt”. As incredible as it may seem, this whole crawling operation hinges on a small file that every website (no matter what it is) has in its root directory (on the hosting server). This file has only one simple role: communicating with search engines. In fact, it is so important that when indexing robots explore your website, it is the very first file they look for and read, to know exactly what to do with your site.
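
For readers who have never opened one, here is a minimal, illustrative robots.txt (the directory and sitemap URL are hypothetical placeholders). It shows the three kinds of instructions involved: which crawler a rule addresses, what that crawler may not visit, and where the sitemap lives:

    # Minimal illustrative robots.txt
    User-agent: *                 # applies to all crawlers
    Disallow: /private/           # hypothetical directory crawlers must skip
    Sitemap: https://www.example.com/sitemap.xml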

So this file, simple as it is, is of great use. So much so that a single line can totally exclude your website from exploration by ALL search engines.
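
That line is the catch-all Disallow rule; placed under a wildcard User-agent line, it shuts every compliant crawler out (shown purely as an illustration, not a recommendation):

    User-agent: *    # addresses every crawler...
    Disallow: /      # ...and forbids the entire site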

Would you like to see your own? Simply go to your website, add “robots.txt” after the final “/” in the address bar, and there you go!
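
For example (with a hypothetical domain), if your site lives at https://yourdomain.com, its robots.txt is served at:

    https://yourdomain.com/robots.txt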

It is in this file that everything plays out.

It is precisely by inserting precise “instructions” into it that you can inform crawlers as to what they may or may not read and index. The plugin in question, called Better Robots.txt, which passed 10k downloads in 6 months, makes it quite easy to produce a robots.txt file optimized specifically for any WordPress site, refining the indexing work that exploration robots will have to perform for most search engines and a large number of other entities. And it is free...
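
By way of illustration only, and not the plugin's literal output, an optimized robots.txt for a WordPress site typically combines rules like these: keep crawlers out of the admin area, leave the AJAX endpoint that public pages may legitimately call reachable, and point crawlers at the sitemap (the sitemap URL below is a hypothetical placeholder):

    # Sketch of an optimized WordPress robots.txt (illustrative only)
    User-agent: *
    Disallow: /wp-admin/              # skip the back office entirely
    Allow: /wp-admin/admin-ajax.php   # but keep the AJAX endpoint reachable
    Sitemap: https://www.example.com/sitemap_index.xml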



You now have the opportunity to optimize your website's exploration work, reduce your site's ecological footprint and, at your level, reduce the greenhouse gas (CO2) production inherent to its existence on the Web.

In any case, we all benefit.


Category: Technology
Region: North America
Tags (meta-keywords): WordPress, Environment, Ecological, Website

This press release is licensed under a Creative Commons Attribution 3.0 Unported License.

