Search engines can index websites without permission

How your new website is instantly indexed on Google

Do you want organic search traffic on your site? I bet the answer is yes - we all want this!

Organic search traffic is definitely important. By most estimates, more than half of all website visits arrive this way - up to 64% of your visits, compared to 5% via social media.

However, none of that matters if your site doesn't show up in search results at all.

So how does your new site or blog get into the index of Google, Bing, or the other search engines? Well, you have two options.

You can take the “turtle” approach - just sit back and wait for it to happen on its own.

Or you can do a little work now to make it happen right away. That frees up time and energy to increase your conversion rate, improve your social signals - and, of course, to write great content and promote it.

I don't know about you, but I'd rather have my pages indexed as soon as possible, because that gives me more time to build a readership.

If getting your pages ranked sounds good, read on to learn 11 simple steps you can take today to get your new site or blog indexed as quickly as possible.

Step 1: Understand how Google and search engines work

Search engines rely on complicated algorithms in order to be able to fulfill their function. The basic process isn't that difficult to understand, though.

Basically, search engines like Google rely on so-called spiders - small pieces of computer code that each search engine sends out to "crawl" the web (hence the name).

The spider's job is to keep an eye out for new content on the web and to work out what it's about - for example, a new page on an existing website, a change to an existing page, or an entirely new website or blog.

As soon as the spider has found a new page or site, it has to figure out what that page is about.

A long, long time ago, search engine spiders were nowhere near as clever as they are today. The only decisive factor for indexing and ranking was how often the particular expression that was being searched for appeared on the page.

And the keyword didn't even have to appear in the body of the page. Plenty of people ranked for their biggest competitor's brand name simply by stuffing the site's meta tags with dozens of variations of it!

Fortunately, for Google search users and ethical website owners, those days are long gone.

Nowadays keyword stuffing is punished, not rewarded. And the keywords meta tag isn't part of the algorithm at all (though there are still good reasons to use it anyway).

If you're not careful, your entire site can be thrown out of the index, which means it will no longer rank for any keyword at all.

These days, Google is far more interested in the user experience on your site and in the intent behind the search - does the user want to buy something (commercial intent) or learn something (informational intent)?

Don't get me wrong: keywords still play an important role, but other factors matter too. According to Backlinko's Brian Dean, there are up to 200 ranking factors in total, including high-quality backlinks, social signals (though not directly), and valid code on all your pages.

However, none of this matters if the spiders never tell the search engines your pages exist in the first place. And that's where indexing comes in.

Indexing is how the spider collects and processes the data from the pages it finds as it crawls the web.

It notes new documents and changes, which are then added to Google's searchable index - provided those pages contain high-quality content and don't set off alarm bells by violating Google's user-first guidelines.

The spider processes the content (text) on the pages and also the positioning of the search terms on the page. It analyzes title tags and alt texts for images.

That's what indexing consists of. When a search engine user then searches for related keywords, Google's algorithm kicks in and decides where your page ranks among all the other pages for those keywords.

But how do search engine spiders find new content - pages, websites or changes?

The spiders start with pages they have already indexed during previous crawling sessions.

Then they add the data from sitemaps (we'll cover those in more detail later).

Finally, the spiders find and follow the links on the pages they crawl, adding those linked pages to the list of pages still to be crawled.

This is the short and simplified version of how Google finds, analyzes and indexes new pages. Many other search engines proceed in a similar way, although the details can vary and each search engine has its own algorithm.

If you've recently published a new website, the first thing to do is check to see if Google has found it yet.

The easiest way to check is a site: search in Google. If Google knows your site exists and has crawled it, the search will return a list of your site's pages.
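For example, with example.com standing in for your own domain, the query looks like this:

```text
site:example.com
```

Google will then list only pages from that domain that it currently has in its index.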

If Google hasn't found your site yet, the search will return no results at all.

Step 2: Add a blog

Why do you need a blog?

It's simple: blogs are hard-working SEO machines. Blog content gets crawled and indexed faster than static pages. In fact, websites with blogs get an average of 434% more pages indexed and 97% more indexed links.

Blogs also bring more traffic. According to HubSpot, companies that blog regularly attract 55% more visitors to their website than those that don't blog.

In addition, blogging works for every company, industry, or niche and for almost every business model - even B2C and e-commerce sites. For example, 61% of online shoppers have actually bought something based on a blogger's recommendation.

Don't be afraid to commit to a blog. Of course it takes effort: you need to publish high-quality, in-depth posts regularly (or have them written for you). But I've found the rewards are well worth it.

And you don't have to blog every day - although 82% of marketers who blog daily report that they are generating customers with their blog entries.

And if you have an ecommerce website, blogging doesn't have to be extremely complex or difficult.

For example, when you build a new product page, write and publish a blog post about that product. Add some high-quality photos and link them to the product page. This helps the product page get crawled and indexed faster.

Step 3: Use Robots.txt

Unless you're an experienced programmer or developer, you may have seen a "robots.txt" file among the files on your domain and wondered what it is and what it does.

"What is it?" is easy to answer: it's a basic, plain text file that lives in the root directory of your domain. If you're using WordPress, it sits in the root of your WordPress installation.

"What does it do?" is a bit more complex. In essence, robots.txt gives search engine bots clear instructions on which pages to crawl and index - and which pages to stay away from.

When search spiders find this file on a new domain, they read its instructions before doing anything else. If they don't find a robots.txt file, they assume you want every page crawled and indexed.

You may be wondering: "Why in the world would I want search engines not to index one of my pages?" That's a good question!

In short, the reason is that not every page on your website should be treated as a separate, searchable page.

For example, let's say you have two pages with the same content - perhaps because you're split-testing the visual design, so the content of the two pages is identical.

As you probably know, duplicate content is a potential SEO problem. One solution is to use your robots.txt file to tell search engines to ignore one of the two pages.

The first step is to confirm that your new site has a robots.txt file. You can do this either via FTP or through the file manager in cPanel (or its equivalent if your host doesn't use cPanel).

If there is no file, you can simply create one using a text editor such as Notepad.

Note: It's important to use a plain text editor, not something like Word or WordPad - those insert hidden formatting codes into the file and break everything.

WordPress bloggers can optimize their robots.txt files using reliable plug-ins like the Yoast SEO plug-in.

The format of a robots.txt file is pretty simple. The first line usually names a user agent - simply the name of the search bot, e.g. Googlebot or Bingbot. You can also use an asterisk (*) as a wildcard for all bots.

Next comes a series of Allow and Disallow directives that explicitly tell search engines which parts of your domain they may crawl and index and which they should ignore.
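As a sketch (the paths here are placeholders, not recommendations), a minimal robots.txt that blocks a duplicate test page while allowing everything else might look like this:

```text
# Rules for all crawlers
User-agent: *
# Keep the duplicate design-test page out of the index
Disallow: /test-page-b/
# Everything else may be crawled
Allow: /

# Optional: tell crawlers where your sitemap lives
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is optional, but it ties in nicely with Step 5 below.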

In summary: The job of the robots.txt file is to tell the search engines what to do with the content of your website. But does it help you index your site?

Harsh Agrawal of ShoutDreams Media says he got pages indexed within 24 hours using a combination of strategies, including robots.txt rules and on-page SEO techniques.

That said, it's extremely important to be careful when editing your robots.txt file, because it's easy to make a mistake if you don't know exactly what you're doing.

A file that has been configured incorrectly can hide your entire website from the search engines - which is exactly the opposite of what you want.

If you'd rather not take any chances, you can hire an experienced web developer to handle the job for you.

You can also use Google's robots.txt testing tool to make sure the file is formatted correctly.
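If you'd like a quick sanity check of your own, Python's standard library includes a robots.txt parser that applies the rules the same way a well-behaved crawler would. A minimal sketch (the rules and URLs below are made-up examples):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# For a live site you would call rp.set_url("https://example.com/robots.txt")
# and then rp.read(); here we parse example rules directly.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
])

# Disallowed paths come back False; everything else True.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

If a URL you expect to be crawlable comes back False, your file is probably misconfigured.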

Step 4: Create a content strategy

In case I haven't said it often enough, let me tell you again: It is for your own benefit to put your content marketing strategy in writing.

If you don't believe me, maybe the Content Marketing Institute will convince you: "B2B marketers who have a written content strategy are more competent and have fewer difficulties with all aspects of content marketing."

In my experience, this is absolutely correct, but a documented content strategy will also help you index the pages on your website once you've created the content of the new pages.

According to HubSpot's State of Inbound 2014 report, marketers who make blogging a priority are 13 times more likely to achieve positive ROI.

According to GrooveHQ's Alex Turnbull, that means:

Doing your best to post valuable, interesting, and useful content and then doing everything possible to make sure potential customers see it.

Here's an example: I created and published a professional infographic on my site, then shared it on another site with a link back to my page. That's content marketing in action.

And since it's an infographic, visitors on both sites are likely to engage with it.

Other examples of off-page content you can publish to help grow your traffic include:

  • Guest blog posts on other pages in your market niche.
  • Press releases shown on the sites that publish such content.
  • Articles on high-quality article directory sites (Note: be careful - the vast majority of article directories are not high quality and can damage your brand, reputation, and SEO).
  • Videos on Vimeo or your YouTube channel.

Naturally, all content carrying your name and brand must be high quality and published on reputable, reliable sites. Otherwise you'll undermine your own goals.

A piece of content published on a “spam” website with a link back to your page indicates to Google that your page is also spam.

A well-thought-out, documented content marketing plan helps ensure you're not scrambling just to push more content out. It keeps you in control and lets you concentrate on generating leads and increasing your conversion rate.

Creating a written content strategy doesn't have to be complicated or difficult. Just follow a framework:

  • What are your goals? Specify SMART goals and how you will measure your progress (e.g. graphs).
  • Who is your target audience? Visitor profiles or personas are essential for understanding your visitors and their wants and needs.
  • What content will you create? Again, you want to make sure you're delivering the content that your target audience would most like to see.
  • Where is it published? Of course you will place your own content on your new page, but you might also want to be present on other pages or use platforms such as YouTube, LinkedIn, and Slideshare.
  • How often will you post your content? It is far better to publish a well-written, high-quality article regularly, e.g. every week, rather than publishing nothing for a whole month.
  • Which system will you use to publish your content? Systems are really just repeatable programs and steps to solve a complex task. They will help you save time and write your content faster so that you stay on schedule. Anything that helps you publish content in less time and without sacrificing quality will improve your bottom line. This includes the blogging / content tools and technologies that you will be using and how they will fit into your system.

Once you have a documented content marketing plan, it will be easier for you to publish great content on a regular basis. It will help you index the pages of your new website faster.

Step 5: Create and submit a sitemap

You've no doubt heard the word "sitemap" before - but you may never have known exactly what it means.

In essence, a sitemap is a list (in XML format) of all the pages on your website. Its main job is to let search engines know when something has changed - either a new page or a change to an existing one - and how often they should check for changes.
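As an illustration (with a placeholder domain), a bare-bones sitemap following the sitemaps.org protocol looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://example.com/blog/first-post/</loc>
    <lastmod>2015-06-02</lastmod>
  </url>
</urlset>
```

Only the &lt;loc&gt; element is required; &lt;lastmod&gt; and &lt;changefreq&gt; are optional hints for the crawler.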

Do sitemaps affect your search rankings? Probably not - at least not substantially. But they will help you to get your website indexed faster.

In today's Hummingbird-driven world of search, there are plenty of SEO myths to be wary of. But one thing stays the same: great content rises to the top, just like cream.

According to the Google Webmaster Blog, sitemaps help ensure your great content gets crawled and indexed faster, so it can climb to the top of the SERPs sooner. In Google's own words: "Submitting a sitemap helps you make sure Google knows about the URLs on your site."

Is there a guarantee that your website will be indexed immediately? No, but sitemaps are definitely a powerful tool to help you with this process.

And it may even help more than Google likes to admit. Casey Henry wondered how much sitemaps affect crawling and indexing, so he decided to run his own little experiment.

Casey approached one of his clients, who ran a fairly popular blog built on WordPress using the Google XML Sitemaps Generator plug-in (more on that later).

With the client's permission, Casey installed a tracking script that logged Googlebot's activity, recording whenever the bot accessed the sitemap, along with the timestamp, IP address, and user agent.

The customer just kept going on their normal publishing schedule (around two or three entries per week).

For Casey, the results of his experiment were "incredible." But see for yourself: when no sitemap was submitted, it took Google an average of 1,375 minutes to find, crawl, and index new content.

What if a sitemap was submitted? The average dropped to 14 minutes.

And the numbers for Yahoo!'s crawler followed a similar trend.

How often should you tell Google to check for changes by submitting a new sitemap? There's no hard-and-fast rule, but certain kinds of content call for more frequent crawling and indexing.

For example, if you're adding new products to an e-commerce site and each one gets its own product page, you'll want Google to check in regularly. The same goes for sites that publish breaking, time-sensitive news.

If you're using WordPress, however, there's a much easier way to create and submit a sitemap: just install and use the Google XML Sitemaps plug-in.

It's the same plug-in Casey Henry used in his case study.

In the settings, you can tell the plug-in how often a sitemap should be created, updated, and submitted to the search engines. It can also automate the process entirely: whenever you publish a new page, the sitemap is updated and resubmitted automatically.

Other sitemap tools include the XML Sitemaps Generator, an online tool that works for any type of website, and Google Webmaster Tools, which lets you take a more hands-on approach.

To use the Google Webmaster Tools, simply log into your Google Account, then add the URL of your new website to the Webmaster Tools by clicking on “Add a Property” on the right.

Enter the URL of your new website in the pop-up box and click “Next”.

Follow Google's instructions to upload the HTML verification file Google creates for you, connect your new site to your Analytics account, or choose one of the other verification options Google offers.

Once your website has been added to the Google Webmaster Tools dashboard, click on the URL to go to that page's dashboard. Click on Sitemaps on the left under "Crawl", and then click on "Add / Test Sitemaps" in the right corner.

You can do the same for Bing with Bing's webmaster tools - it's good to cover all the bases.

Step 6: Install Google Analytics

You know you need access to basic analytical data for your new website, don't you? So why not start with Google Analytics and maybe - but just maybe - kill two birds with one stone?

Installing Google Analytics can put your new site on Google's radar: it lets Google know the site exists, which in turn may help trigger the crawling and indexing process.
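The installation itself is mostly copy-and-paste. As a rough sketch, Google's tracking snippet goes into the head of every page - note that the exact code and the placeholder ID (G-XXXXXXXXXX here) change over time, so always copy the snippet from your own Analytics property rather than from this example:

```html
<!-- Google Analytics (gtag.js) - replace G-XXXXXXXXXX with your own ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```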

Then you can move on to the more advanced tools of Google Analytics, setting goals and tracking conversions.

Step 7: Submit website URL to search engines

You can also take the direct approach and submit your site's URL to the search engines.

Before doing this, you should know that there are many different opinions regarding submitting the page URL as a method of indexing a page.

Some bloggers say it's redundant, if not outright harmful, and since there are other effective methods, most of them skip this step.

On the other hand, it doesn't take long and it doesn't hurt either.

To submit your site's URL to Google, simply log in to your Google account, go to Submit URL in Webmaster Tools, enter your URL, check the "I'm not a robot" box, and click "Submit request."

To submit your site to Bing, use Bing's URL submission page, which submits it to Yahoo at the same time.

Step 8: Create and update social profiles

Will you set up social media profiles for your new site or blog? If not, now is the time.

Why? Because search engines pay attention to social signals. These signals can potentially cause search engines to crawl and index your new page.

Social signals also help your pages rank higher in search results.

Google's Matt Cutts said a few years ago:

I filmed a video back in May 2010 in which I said that we didn't use "social" as a signal, and at the time, we did not use that signal. But we're recording this in December 2010, and we do use it now.

It is now evident that a solid social media marketing plan is helpful for SEO. Social profiles on your website also give you another place to add links to your page or blog.

Twitter profiles, Facebook pages, LinkedIn profiles or company pages, Pinterest profiles, YouTube channels and especially Google+ profiles or pages - all of them are easy to create, and all of them are great places to add links pointing to your website.

If, for whatever reason, you don't want to create new social profiles for your new site or blog, you can simply add a link to the new site from your existing profiles instead.

Step 9: Share links to your new website

Another easy way to get links to your new site or blog is to share it in your own social media updates.

Granted, these are nofollow links, but they still count for indexing purposes, since we know that Google and Bing, at least, track social signals.

If you're on Pinterest, choose a good, high-quality image or screenshot from your new site. Add the URL and a keyword-rich description (i.e. make sure you use terms relevant to your site), and pin it either to an existing board or to a new one you create for the site.

If you're on YouTube, get creative! Record a short video introducing your site and highlighting its features and benefits, then add the URL to the video description.

If you have an existing email list from another site in the same niche as your new one, you can send an email to the whole list introducing the new site, with a link included.

And don't forget email itself: add your new URL and site name to your email signature.

Step 10: Set up your RSS feed

What is RSS? And how does it affect indexing and crawling?

Before we get to that, let's clear one thing up: many people think RSS is dead. In my opinion it isn't, although it may not be evolving quickly and its user base has been shrinking steadily, especially since Google shut down Google Reader in 2013.

Even Danny Brown - who, in the article linked above, dubbed RSS "Really So-Over-It Syndication" - has since changed his mind a bit.

Generally speaking, RSS helps increase readership and conversion rates, but it can also help get your pages indexed. RSS stands for "Really Simple Syndication" or "Rich Site Summary," and it benefits both users and site owners.

RSS feeds provide a much easier way for users to consume a lot of content in less time.

Site owners benefit from instant publication and distribution of new content, and from the ability to let new readers subscribe to content as soon as it's published.

If you manage your RSS feed with Feedburner (Google's own RSS management tool), Google is notified that you have a new site or blog that needs to be crawled and indexed.

RSS also lets Google know when you post a new post or page.
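Under the hood, an RSS feed is just an XML file listing your latest posts - your blogging platform normally generates it for you. A minimal RSS 2.0 feed (placeholder domain and titles) looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com/</link>
    <description>Fresh posts from Example Blog</description>
    <item>
      <title>My First Post</title>
      <link>https://example.com/my-first-post/</link>
      <pubDate>Mon, 01 Jun 2015 09:00:00 GMT</pubDate>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>
```

Each new item you publish is what tells feed readers - and services like Feedburner - that there's fresh content to fetch.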

Step 11: Submit to Blog Directories

You probably already know that submitting your new URL to blog directories helps potential new visitors find your site.

But it also helps make indexing faster - if you do it right.

Long ago, the digital landscape was littered with free blog directories. There were literally hundreds - if not thousands - of these sites, and far too many of them offered little or no value to blog readers.

The quality problem got so bad that Google removed many free directories from its index in 2012.

Moz investigated the problem, analyzing 2,678 directories, and concluded that only 94 of them had been banned outright - not too bad. However, another 417 directories had avoided a ban but were penalized.

So what's the answer? When submitting your URL to directories, make sure that you are only submitting it to reputable and reliable directories.

Best-of lists of directories compiled by trusted blogs can help you separate the wheat from the chaff - just make sure the list you use is up to date. This one by Harsh Agrawal, for example, was updated in January 2015.

Other options you may want to explore include TopRank, a long list of sites where you can submit your RSS feed and blog; Technorati, one of the best-known blog directories; and - if you've published a decent amount of high-quality content - the Alltop subdomain for your niche or industry.

Submitting your URL to high-quality sites with decent domain authority not only puts your content in front of a whole new audience - it also earns you inbound links that can prompt search engines to crawl and index your site.


So there you have it - eleven ways to get your new site or blog indexed quickly by Google and other search engines.

This is by no means an exhaustive list. There are other methods that could help - for example, bookmarking with social bookmarking sites and StumbleUpon.

And as with most content marketing strategies and tactics, things change fast - especially where search engines are concerned. It's essential to stay on top of industry news and to vet new methods with your own independent research.

Which crawling and indexing methods have you tried? What were your results?