Wednesday 30 May 2018

Search Engine Optimization - Introduction, Need & Types

Introduction to SEO

SEO, or Search Engine Optimization, is the process of maximizing the visibility of websites in search engines, in the results pages (SERPs) they generate, and of getting the optimum level of relevant traffic directed to one's site; hence the word "optimization".
It is common knowledge, and now statistically proven, that the higher up or earlier a website appears on a search engine's results page, the more traffic it is likely to get from that engine's users. The second factor influencing traffic is frequency of appearance, i.e. how many times a certain website appears on a search engine's results page.
The primary goal of search engine optimization is to maximize both of these aspects of appearance, position and frequency, in order to direct more traffic towards a website from a search engine.

Crawlers/Spiders: Search engines use internet bots called "crawlers" or "spiders" to index websites on the World Wide Web and to keep those indexes updated. This process is called crawling; put simply, crawling is the process of reading and cataloguing websites on the internet. Algorithms are then written to generate results from the indexes these spiders create. A large part of search engine optimization is about, firstly, making a website easier for a search engine's spiders to index and, secondly, making it as responsive as possible to the search engine's algorithms.
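To make the "reading and cataloguing" idea concrete, here is a toy sketch of the reading half of a crawler, using only Python's standard library. A real crawler fetches pages over HTTP and follows the discovered links; this simplified version just extracts the links from a hardcoded page (the URLs are made up for illustration):

```python
from html.parser import HTMLParser

class LinkIndexer(HTMLParser):
    """Collects every href found on a page -- the 'reading' half of crawling."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real crawler this HTML would be fetched over HTTP;
# a hardcoded page keeps the sketch self-contained.
page = '<html><body><a href="/about">About</a> <a href="https://example.com/blog">Blog</a></body></html>'
indexer = LinkIndexer()
indexer.feed(page)
print(indexer.links)  # discovered URLs, which a crawler would queue for the next pass
```

The discovered links become the queue for the next crawl pass, which is how a spider gradually catalogues an entire site.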

What do you understand by the term 'SEO'?

SEO means search engine optimization. It is the process of getting traffic from the free, organic or natural search results on search engines like Google, Yahoo or Bing (we generally use Google and Bing for our clients). We are not paying anyone to pull people to the site/webpage.

Need for SEO

Billions of searches are conducted online every single day. This means an immense amount of specific, high-intent traffic.
Many people search for specific products and services with the intent to pay for them. These searches are said to have commercial intent: the searcher is clearly indicating that they want to buy something you offer. So how does Google determine which pages to return in response to what people search for? And how do you get all of this valuable traffic to your site?
Google’s algorithm is extremely complex, but at a high level:
  • Google is looking for pages that contain high-quality information relevant to the searcher's query
  • Google's algorithm determines relevance by "crawling" (or reading) your website's content and evaluating, algorithmically, whether that content matches what the searcher is looking for, based on the keywords it contains and other factors (known as "ranking signals")
  • Google determines "quality" by a number of means; a site's link profile (the number and quality of other websites that link to a page) and its content are among the most important
Increasingly, Google's algorithm evaluates additional ranking signals to determine where a site will rank, such as:
  • How people engage with a site (Do they find the information they need and remain on the site, or do they "bounce" back to the search page and click on another link? Or do they just ignore your listing in search results altogether and never click-through?)
  • A site’s loading speed and “mobile friendliness”
  • How much unique content a site has (versus “thin” or duplicated, low-value content)
There are hundreds of ranking factors that Google's algorithm considers in response to searches, and Google is constantly updating and refining its process to ensure that it delivers the best possible user experience.
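As a toy illustration of the keyword-matching idea described above (this is not Google's actual algorithm, which combines hundreds of signals), relevance scoring by keyword overlap might be sketched like this:

```python
def relevance_score(query, page_text):
    """Toy relevance: the fraction of query words that appear in the page.
    Real ranking combines hundreds of signals (links, engagement, speed...);
    this only illustrates the basic keyword-matching component."""
    words = query.lower().split()
    text = page_text.lower()
    return sum(word in text for word in words) / len(words)

# Hypothetical query and page text for illustration
print(relevance_score("buy running shoes", "We sell running shoes online"))  # ~0.67
```

A page mentioning more of the searcher's words scores higher here, which is roughly why content that genuinely covers the query's topic tends to rank better.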


Types of SEO

There are two types of SEO:

1. White Hat SEO
White hat SEO uses techniques and methods that improve a website's search engine rankings while adhering to search engine (mainly Google) guidelines. Some white hat SEO techniques include: high-quality content development, website HTML optimization and restructuring, and link acquisition campaigns supported by high-quality content and by manual research and outreach. The result is steady, gradual, but lasting growth in rankings.

2. Black Hat SEO
Black hat SEO covers unethical techniques, banned by Google, for getting traffic to a website: keyword stuffing (cramming the site with any number of keywords), cloaking (showing one version of the content to the search engine and a different one to the user), doorway pages (thin pages built only to funnel searchers elsewhere), hiding text by setting it to the same color as the background, and so on. These tricks might not look unethical to us, but they are a way to cheat the user, so the site gets banned and the search bot will no longer crawl or index its pages. After that, we have to correct the mistake before we might be able to rank again. This takes a lot of time and is bad for the company's reputation; black hat SEO results in quick, unpredictable, and short-lived growth in rankings.

Conclusion

There are hundreds of ranking factors that Google's algorithm considers in response to searches, and search engines are constantly updating and refining their processes to ensure they deliver the best possible user experience. We have to keep pace and optimize accordingly for steady, gradual, but long-lasting growth in rankings.

Wednesday 23 May 2018

In and out about Webmaster Tools - Search Console

Hey!! Let's discuss everything about Webmaster Tools, which recently changed its name to Search Console. It is generally used for frequent analysis and for checking your website's statistics.

Following are the advantages of using Google Webmaster Tools:

1. You can Submit/check the sitemap for your website.
2. Adjust the crawl rate of the Google bots for your website and view the statistics.
3. Generate/check the robots.txt file for your website.
4. List the internal and external pages that link to your website.
5. Check what keyword searches led the site to be displayed in the Google search engine results pages and click-through rates for them.
6. Check statistics about how Google has indexed your website and whether it has found any issues while doing so.
7. Set a preferred domain name (e.g. domain.com over www.domain.com, or vice versa), which determines how the site's URL is displayed in search engine results pages.

To use the tool, you need to confirm that you are the owner of the website and have full access to it. The process is absolutely automated: all you need to do is place a given text file in the web root directory of your website. Webmaster Tools then provides a full list of the search phrases that result in your website being displayed on the search engine results pages, all internal and external links to your website, keywords, errors and missing pages. Here you will see the various errors that Google's bots encountered while trying to crawl your website; each category links to a page with more detailed information, so you can review and rectify the issues.
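The sitemap and robots.txt items above work together: robots.txt tells crawlers what to skip and where the sitemap lives. A minimal example (the example.com domain and paths are hypothetical) might look like this:

```
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```

The file must sit at the web root (https://example.com/robots.txt), and the Robots.txt Tester in Webmaster Tools can then verify that it does not accidentally block pages you want indexed.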

Elements of Google Webmaster Tools:

1. Search Appearance:
  • Structured data
  • Rich cards
  • Data highlighter
  • HTML improvements
  • Accelerated Mobile Pages (AMP)
2. Search Traffic
  • Search Analytics
The Keywords section will display the top keywords associated with your website. This is very important information about your website and how it is displayed in the search engine results pages.
  • Links to your Site
This area will display links that lead to your website. It also includes a link to an article with more information on why those statistics might differ from those of another tool.
  • Internal Links
  • Manual Action
  • International Targeting
  • Mobile Usability
3. Google Index
  • Index Status
  • Blocked Resources
  • Remove URLs
4. Crawl
  • Crawl Error
This feature helps detect crawl errors, which come in two types: site errors and URL errors.

a) Site Errors
These affect the entire site (DNS, server connectivity, robots.txt) and prevent Googlebot from even requesting a URL. Be careful.

b) URL Errors

URL errors are errors that are specific to a particular page; examples include "Not found", "Access denied" and soft 404 errors.
  • Crawl Stats
  • Fetch as Google
  • Robots.txt Tester
  • Sitemaps
If you have just started using Google Webmaster Tools, this area allows you to upload a sitemap for your website.
  • URL Parameters
5. Security Issues
6. Web Tools
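The Sitemaps section above accepts a standard XML sitemap. A minimal one can be generated with Python's standard library; the URLs here are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls):
    """Return a minimal sitemap.xml string listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical site pages for illustration
sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/blog"])
print(sitemap_xml)
```

The resulting file is uploaded as sitemap.xml and submitted in the Sitemaps section, so Googlebot knows exactly which pages you want crawled.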

Conclusion: 
Google Webmaster Tools has its own set of advantages. Whether it is site-crawling detail or server-error detail, it helps you with stats, linked pages, sitemaps, robots.txt details and much more.

Tips for making a Blog SEO Optimized with Proper Content

Every time we get a blog to optimize and post on a website, I see that the keywords are not in place or the headings doesn't contai...