1. Keyword Research:
Keyword research means finding and analyzing the keywords most relevant to a website; it informs both content strategy and long-term marketing strategy. As a rule of thumb, the more ads a keyword triggers, the higher its commercial intent.
- Identify a list of words relevant to the business and build a keyword universe.
- "Opportunity Keywords", "Related Keywords", and "LSI Keywords"
- Autocomplete Keywords
- Questions Keywords
- Expand the list
- Prioritize list
- Categorize priority keywords
- Identify preferred landing pages
We use tools to perform keyword research:
- Google Keyword Planner
2. Keyword Gap Analysis
Analyzing and finding keywords that drive traffic to our competitors but not to our website, which reveals valuable keyword opportunities we are missing out on.
3. Backlink Gap Analysis
Analyzing and identifying which websites link to our competitors but not to us.
4. Analyze Google’s First Page
- Analyze Google’s first page using “Opportunity Keyword” or “Related Keywords” or “LSI keywords”
5. Voice Search Optimization Strategy
- Optimize Your Business Listings completely including your description, business hours, name, address, and phone number
- Apple Maps
- FAQ page
- Content Focus on Question Phrases (like talking to someone)
- Content Focus to Provide Answers
- Focus on conversational language (voice queries are longer and contain conversational words)
- Use natural-sounding language
- Target long-tail keyword phrases
- Think about user intent
- Focus on action queries
- Tighten up your local SEO schema
- Website must be fast, mobile-friendly, and secure
- Describe the neighborhood (target search queries like "things to do near me")
Notes: When typing into a search engine, we use short phrases because it saves effort. With voice search, we are more likely to ask a full question, like "Which restaurant is best?" So think about how your audience speaks about your business, products, and services.
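The "local SEO schema" item above can be implemented with structured data. A minimal sketch of a LocalBusiness JSON-LD block — all business details below are placeholders, not a real listing:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Restaurant",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Noida",
    "postalCode": "201301",
    "addressCountry": "IN"
  },
  "telephone": "+91-00000-00000",
  "openingHours": "Mo-Su 09:00-22:00",
  "url": "https://www.example.com/"
}
</script>
```

Keeping the name, address, and phone number here identical to your business listings helps voice assistants answer "near me" queries consistently.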
6. Content Is King
Content is king. Your website is really just a wrapper for your content. Our website content tells prospects what we do, where we do it, who we have done it for, and why someone should use our business.
What & where we do it.
Why someone should engage with our business.
Optimize our marketing content including case studies, portfolio entries and testimonials.
7. Create Something Different & Better
“The Skyscraper Technique”
- We can create something different.
e.g. "X Ways to Buy Real Estate Property" or "How to Buy Property in Noida?"
- We can create something better.
8. Plan out Site’s Architecture (Invest in Web Design)
9. Use a Mobile-Friendly Design
10. Optimize for User Intent
- Graphs and Charts
- Stat as text
- Stat as graphic
- Blog Post Banners
12. Improving and Updating Our Content
13. Increase Domain Authority
14. Build Links to the Page
- Broken Link Building
- Referring domains
15. Building a Community on Site
A community indirectly helps with SEO, and blog comments may even be a direct Google ranking factor. Community members are also more likely to share our content on social media, which can help our rankings.
A webhook is a way for a website to automatically send a notification to another application once an event happens.
- We need to figure out why people link to content in our industry.
- We can create something new
- We can create case studies to earn links, but each case study needs to feature a specific result.
Content syndication is when a piece of web-based content is republished (i.e. syndicated) by another website. Content syndication can be an excellent source of referral traffic.
Content doesn’t need to be visual to be effective. Podcasting is an audible form of content that is excellent for brand building and growing an audience.
19. Build an Opt-In Email List
20. Contests and Giveaways
21. Affiliate and Associate Programs
22. Maintain a Blog
24. Google Search Console
26. Bing Webmaster Tools
27. Technical SEO
Technical SEO can ensure that a search engine can read our content and explore our site.
Tools like Screaming Frog can explore your website and highlight technical problems.
Can a search engine explore our website?
Be clear about which pages the search engine should index.
The site must be mobile-friendly.
Page load times are a crucial factor.
How is the content structured on our website?
28. On-Page SEO Activities:
On-page SEO shows search crawlers what a website is about. That helps search engines see that a page is a good search result because it is relevant and useful.
- Title Tag
- Meta Tag Optimization (unique meta descriptions)
- H1, H2 Tags (Headline Optimization)
- Call-to-actions (CTAs)
- Good user experience (UX)
- Social Proof
- Internal Linking (Linking to other pages on website)
- External Outbound Linking
- Keyword Analysis and Optimization
- URL Optimization
- Image Optimization
- Content Optimization (………………)
- OG Tags
- Canonical Tag
- 301 Redirects
- SEO Friendly URLs
- Landing Page Optimization
- Structured Data implementation and testing
- Responsive across various devices
- Sitemap.xml (Submit to Google Search Console)
- Sitemap.html (Optional)
- robots.txt file
- .htaccess file
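Several of the on-page items above live in the page `<head>`. An illustrative sketch — all URLs, titles, and descriptions are placeholders:

```html
<head>
  <!-- Title tag: unique and keyword-rich, ideally under ~60 characters -->
  <title>Buy Property in Noida | Example Realty</title>
  <!-- Unique meta description for this page -->
  <meta name="description" content="Browse verified residential properties in Noida with Example Realty.">
  <!-- Canonical tag: the preferred URL for this content -->
  <link rel="canonical" href="https://www.example.com/property/noida/">
  <!-- OG tags for social sharing -->
  <meta property="og:title" content="Buy Property in Noida | Example Realty">
  <meta property="og:description" content="Browse verified residential properties in Noida.">
  <meta property="og:image" content="https://www.example.com/images/noida-banner.jpg">
  <meta property="og:url" content="https://www.example.com/property/noida/">
  <!-- Responsive across various devices -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```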
29. Off-Page Optimization Activities/Authority Building/Link Building
Off-page SEO shows search crawlers that a website is important and valuable. That helps search engines see that a page is a good search result because the brand and website are authoritative and popular.
- Creating Shareable Content
- Guest Posting (Contribute as Guest Author)
- Link Building
- Image Submission
- Video Submission
- Document Sharing
- PPT Submission
- Forum Submission
- Event Submission
- Web 2.0 Submission
- Question and Answer
- Social Bookmarking
- Article Submission
- Search Engine Submission
- Social Media Engagement
- Bookmarking Submission
- Classified Submission
- Press Release Promotion
- Infographics Submission
- Local Directory Submission
- Local Listings & Yellow Pages
- Influencer Outreach
- Google My Business
- Hosting webinars (Webinars are basically online mini-courses.)
- Content Syndication
- Podcasting (We can create own podcast or appear on someone else’s podcast.)
Notes: Building authority involves link building, and links are a crucial part of developing strong organic rankings. Make sure to build the kind of real links that make sense in the real world and won't upset the qualitative and sometimes punitive parts of the algorithm. Content should deserve to rank and deserve to be linked to.
- Lead magnets
- A/B testing emails
- Google Ads
- YouTube Advertising
- Facebook Advertising
- LinkedIn Advertising
- Programmatic Advertising
Ways to achieve successful local SEO
1. Optimize Our Google My Business Account
- Google My Business profile
- Google My Business image
- Google My Business Posts
- Accurate and up-to-date information
- Encourage our customers to review our business online & respond sincerely to reviews
- Include your logo, hours of operation, accepted payment methods, and the products or services you sell.
Note: Google says it’s okay to ask your customers for reviews while Yelp actively discourages it.
2. Optimize for Voice Search
- Traditional question starters (who, what, when, where, why and how).
3. Create Content Based on Local News Stories or Events
4. Optimize Your Website for Mobile
5. Local Keywords
Your keywords should be relevant to local customers. It only makes sense, doesn’t it?
6. Dedicated Pages for Location/Product/Service
Location pages are a must if our business has more than one location in an area.
- Name, address, phone number & hours
- Individualized descriptions
- Google Maps attached to each location page
- With multiple locations, we must create unique content for each page
7. Online Business Directories
Online business directories such as Yelp, Foursquare, MapQuest, and YellowPages
8. High-Quality Backlinks
Link signals (backlinks pointing to our site) are among the most important ranking factors both for snack-pack (local-pack) results and for localized organic results.
Google uses metrics like the following to measure authority and helpfulness:
- Quality Content
- High Quality Links
- User Testing Score
- Domain Rank
- Content Age
- Social Signals
- Page Load Speed
- Internal Links
- Images & Videos
- Outbound Links
Search engines give links varying levels of value depending on the:
Linking site’s authority: Links from more authoritative sites have more value.
Relevance of the content on the linking page to the linked site: Links from pages that are relevant to your brand, industry, and linked content have more value.
Anchor text of the link: Links with anchor text that is relevant to your brand or content on the linked page have more value.
Number of links to the linking site: Links from sites that have a large backlink profile have a higher value.
Page Authority: Interlink low-authority pages with high-authority pages, but only where the pages are relevant.
Target Keyword: Target primary keywords for each page.
Keyword-Rich Anchor Text: Use keyword-rich anchor text in internal links from high-authority pages.
Semantic SEO: Optimize content for semantic SEO.
Related Keywords: Use keywords related to the target keyword, plus LSI keywords, in the content.
Keyword-Rich URLs: Use short, keyword-rich URLs.
Awesome Content: Make content look awesome using graphs, charts, etc.
Focus on gaining high-quality backlinks from authoritative sites, with relevant content and relevant anchor text.
Use of .htaccess file
- Creating SEO-Friendly URLs
- Overriding CMS URLs
- Removing .html and .php
- Adding rel=“canonical” to PDFs and Headers
- Developing Various Redirects
- Redirecting to the Latest Website Version
- Solving 404 Errors with 301 Redirects
- Redirect your older URLs to newer URLs
- Caching for Site Speed
- To adjust the expiring time
- Robot Directives
- Sending Spiders to Sitemap
- Error documents
- Password protection
- Deny visitors by IP address
- Deny visitors by referrer
- Hot link prevention techniques
- Blocking offline browsers and ‘bad bots’
- DirectoryIndex uses
- Adding MIME types
- Enable SSI with .htaccess
- Enable CGI outside of the cgi-bin
- Disable directory listings
- Setting server timezone
- Changing server signature
- Preventing access to your PHP includes files
- Prevent access to php.ini
- Forcing scripts to display as source code
- Ensuring media files are downloaded instead of played
- Setting up Associations for Encoded Files
- Preventing requests with invalid characters
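A few of the uses listed above, sketched as Apache .htaccess directives. Paths, file names, and the IP address are placeholders; test any of these on a staging server before deploying:

```apache
# Solve a 404 with a 301: permanently redirect an old URL to a new one
Redirect 301 /old-page.html https://www.example.com/new-page/

# SEO-friendly URLs: serve /page for /page.html (removing .html)
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]

# Caching for site speed: adjust expiry times per file type
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>

# Disable directory listings
Options -Indexes

# Deny a visitor by IP address (Apache 2.4 syntax)
<RequireAll>
  Require all granted
  Require not ip 192.0.2.10
</RequireAll>

# Custom error documents
ErrorDocument 404 /404.html
```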
Use of robots.txt file
Robots.txt file URL: www.example.com/robots.txt
Disallow crawling of the entire website.
Disallow crawling of a directory and its contents
Allow access to a single crawler
Allow access to all but a single crawler
Disallow crawling of a single webpage by listing the page after the slash:
Block a specific image from Google Images:
Block all images on your site from Google Images:
Disallow crawling of files of a specific file type (for example, .gif):
- Allowing all web crawlers access to all content
Using this syntax in a robots.txt file tells web crawlers to crawl all pages on www.example.com, including the homepage.
- Blocking a specific web crawler from a specific folder
This syntax tells only Google’s crawler (user-agent name Googlebot) not to crawl any pages that contain the URL string www.example.com/example-subfolder/.
- Blocking a specific web crawler from a specific web page
This syntax tells only Bing’s crawler (user-agent name Bing) to avoid crawling the specific page at www.example.com/example-subfolder/blocked-page.
4 Pillars of SEO:
- Technical SEO: How well your content can be crawled and indexed.
- Content: Having the most relevant and best answers to a prospect’s question.
- On-Page SEO: The optimization of your content and HTML.
- Off-Page SEO: Building authority to ensure Google stacks the deck in your favor.
When Should You Choose Canonical vs. 301 Redirect?
Scenario 1: Similar Products with Similar Descriptions in an Online Shop
This is an issue that many online shop owners and operators face. Given the vast inventory of many online stores, there are many similar or almost identical products (e.g. same color or model number). Hence, these products will have similar product descriptions but will be indexed with differing URLs.
By having an indication of the product in the URL, users know what to expect when clicking on the link. On the other hand, multiple URLs for similar products also lead to duplicate content in search engines.
The solution: To avoid duplicate content, it is important that only one URL is indexed by Google. The easiest way to do this is to use the canonical tag. This way, Google knows which URL to index.
Example: There are two websites for similar products
Important: If possible, use only complete URLs within the canonical tag.
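For instance (placeholder shop URLs): if /shoes-blue/ and /shoes-red/ share a nearly identical description and /shoes-red/ is the preferred version, the duplicate page would carry:

```html
<!-- In the <head> of https://www.example-shop.com/shoes-blue/ -->
<link rel="canonical" href="https://www.example-shop.com/shoes-red/">
```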
Scenario 2: A New Version of a Web Page Is Launched
You run a blog and have written an article that is now being updated and renewed. As a result, not only does the content change but the URL as well. To prevent losing existing traffic to the old article, the old URL is ideally redirected to the new one. The link strength will be almost entirely passed on. That is especially important if the first article has frequently been linked to by other websites.
The solution: A 301 redirect will securely redirect the old URL to the new one without creating any disadvantages. It is advisable to drive traffic through fresh links (internal and backlinks) to the new URL as well, so that search engines can replace the old URL with the new URL as soon as possible.
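On an Apache server, such a redirect can be sketched in .htaccess (both URLs are placeholders):

```apache
# Permanently redirect the old article URL to the renewed one
Redirect 301 /blog/old-article/ https://www.example.com/blog/updated-article/
```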
Scenario 3: Conversion of the Website from HTTP to HTTPS
Many websites are being converted to the encrypted https version. This increases the user’s security, and the change may give webmasters a ranking benefit. However, this conversion can lead to duplicate content, as Google indexes both the http and the https version.
To avoid this problem, a 301 redirect can be used once again. The http version of the website will be redirected entirely to the https version. Important: The permanent redirects may increase the loading time significantly for large sites.
The solution for smaller websites: All web pages are redirected to the https version via the 301 redirect.
The solution for very large websites: The rel=canonical refers to the https version as the original URL. This way, crawlers are encouraged to index the https version.
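The site-wide 301 to HTTPS for smaller sites can be sketched with Apache mod_rewrite, assuming an .htaccess file at the web root:

```apache
RewriteEngine On
# If the request did not arrive over HTTPS, 301-redirect it to the HTTPS version
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```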
Scenario 4: A Product is No Longer Carried in the Online Shop
Product offerings in online shops regularly change. A product might be sold out, or is removed for other reasons. If the URL is deleted upon removal of a product, the user experience suffers as the server can no longer find the requested URL, and shows an error page instead.
The solution: To avoid weakening the user experience, a 301 redirect to a new URL may be helpful. That could be a redirect to the main category, the newer version of the product, or a similar product.
Yet, if a product is sold out only temporarily, a 302 redirect could also be used. It signals that the redirect is only temporary.
Suggestion: If you wish to deliberately display an error page, you should optimize your 404 page. When users reach this page, they not only receive an error message but also information on which products they can find instead. It is also possible to integrate a search function into the 404 error page.
Figure 1: Search function on a 404 error page at Airbnb.
Scenario 5: Relaunch of a Website with a URL Alteration
If a website is at the top of search engine rankings before its relaunch, website operators will want to maintain its position. However, that should not be taken for granted. If the new site is simply launched without having made any preparations, old URLs are no longer accessible. A search engine indexes the exact URL of an individual web page, and if the structure of the website is changed, almost none of the web pages will keep their URL.
The solution for the domain transfer: The old domain URL can be forwarded to the new domain URL via a 301 redirect. This way, users can be led directly to the new URL, without having to deal with error pages, or receiving browser messages that the complete website is unavailable. It is important to ensure all pages are correctly redirected.
This is why all of the “old” URLs of the website must be captured before the relaunch: To assign them to the new domain. It is also important to continuously monitor the redirects after the relocation. If necessary, webmasters can quickly react and minimize ranking losses.
The canonical tag and the permanent 301 redirect are powerful tools that are not only useful to avoid duplicate content but can also be used to improve the user experience. Knowing when to use which, will ensure user satisfaction and positive results in search engines.
You should avoid duplicate page titles (meta titles) on your website: the more duplicate content and duplicate page titles you have, the worse those pages will rank in search engines.
What exactly you should do about your duplicate page titles depends on the page content. Below are some examples and the solutions to fix the “duplicate page titles” error in SEO:
1. Two or more pages with the exact same page title and the exact same page content, but different URLs.
It sometimes happens that you want the same page in two different places on your website. Imagine that you have a product that you offer to your business customers and also to your private customers. Then you might want to put this page into two different locations/URLs on your website.
www.example.com/business/myproduct.html (Page Title: “My Product”)
www.example.com/private/myproduct.html (Page Title: “My Product”)
Both pages have a right to exist, but Google will not know which one is the more powerful one (the original), so it will more or less split the link juice (ranking power) between the two pages. In this case you should use the rel="canonical" tag. The rel="canonical" tag should be placed on the duplicate page and should point to the original page.
If the original from our examples above should be the page in the business folder, then the rel="canonical" tag needs to be found in the page in the private folder (the duplicate page) and should look like this:
<link rel="canonical" href="https://www.example.com/business/myproduct.html" />
2. Two or more pages with the exact same page title but different content on the pages
If you have two pages with different content but the same page title you should think about giving the individual pages page titles that are more specific.
www.example.com/business/myproduct.html (Page Title: “My Product”)
www.example.com/private/myproduct2.html (Page Title: “My Product”)
Both of those pages have the same page title but they are actually not the same page. So you need to give one page the title “My Product” and the other one “My Product 2”. In some cases it will be obvious what a better page title will be, and there might be cases where finding a new page title needs some more thought.
2.1. All news pages have the same page title (News Detail | Company name)
On many websites you can see that all pages with a news article have the same page title
www.example.com/business/news.html?newsid=01 (Page Title: “News Detail | MyCompany”)
www.example.com/business/news.html?newsid=02 (Page Title: “News Detail | MyCompany”)
In this case your CMS will most likely create the page title on its own and give all news articles the same title. This is not a good thing, firstly because of the lower ranking for the individual pages as described at the beginning of this article, and secondly because you are missing out on the use of keywords. You should contact your CMS provider or your agency and ask them to configure your CMS so that the article name (probably your H1 header) is automatically set as the page title.
Imagine if your pages had the article names as page titles:
www.example.com/business/news.html?newsid=01 (Page Title: “MyCompany expands group EBIT to €129.0 Million”)
www.example.com/business/news.html?newsid=02 (Page Title: “MyCompany gets an award from Gaultmillau”)
With those two page titles your website can be found for keywords such as “group EBIT” or “Gaultmillau award”. By having more page titles you will increase your chances of being found via different types of keywords.
3. The news archive has pagination pages that all have the same page title
Often you will see websites with news or image archives where you can browse through current and old articles by clicking on so-called "pagination pages" (/news/page/1; /news/page/2; /news/page/3). They will usually be embedded in the same content, but by clicking on 1, 2, 3, etc. you will see older articles or images. Search engines will see such pages as the same page, especially if they all have the same page title. That will decrease their link juice (ranking power) and possibly also have a slight negative effect on the rest of the pages on your domain. In this case you should use the noindex,follow robots meta tag:
<meta name="robots" content="noindex, follow" />
You start by researching popular trends, topics, and already well-received pieces of existing content across the topic areas your business typically covers. Then, you look for new and unique ways to create content that communicates a similar message — with a twist. This might mean that you leverage a new, more engaging medium, update the statistics, or employ a better design.
Once you’ve created a new and improved piece of content, reach out to the folks that have already linked out to similar content to put your piece on their radar … and hopefully earn a link.
This technique works really well for a few reasons:
- There’s already demand.
- You’re dealing with a primed audience.
- There’s serious ranking potential.
6-Step Checklist for Using the Skyscraper Technique
1) Research and uncover opportunities.
2) Build a list of potential distribution partners.
3) Create better content.
4) Promote to your audience.
5) Reach out to the right people.
6) Stay current.