13 golden tips for optimizing your site in Google results

The first step in optimizing a site for Google results is to make sure Google can find it. It is best to list the site's pages in a sitemap. A sitemap is an XML file that notifies Google of new pages or of changes to existing pages. Google also discovers pages by following the links between them.
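As a minimal sketch, a sitemap for a small site might look like this (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```

The file is usually placed at the site root (for example, /sitemap.xml) and submitted through Google Search Console.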

1. Specify pages that should not be crawled by Google.

Suggested ways to hide non-sensitive pages from Google's crawler

If the URLs you want to keep out of Google do not contain sensitive data, a robots.txt file can be used to hide them.

A robots.txt file tells Google which paths on the site not to crawl. The file must be named robots.txt and placed at the root of the site.
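For illustration, a minimal robots.txt that hides two non-sensitive sections (the paths are placeholders):

```txt
# Applies to all crawlers
User-agent: *
Disallow: /internal-search/
Disallow: /print-versions/

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```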

Pages and URLs hidden from Google in this way may still be discovered and indexed by Google's bots. Therefore, do not use this method to protect sensitive information.

Google Search Console (also known as Google Webmaster Tools) offers a safe tool for creating a robots.txt file, called the robots.txt generator.

Also, avoid:

  • Letting Google crawl your site's internal search results pages. Users dislike clicking a result in Google only to land on another page of search results.
  • Letting URLs created through a proxy be crawled. (Content management software can expose different URLs for the same parts of the database; for example, a new URL may be created for each click on a link on the site, and each of these URLs can be crawled by Google.)

Note: If the site has subdomains, you must create a separate robots.txt file for each subdomain.

2. What tools should we use to hide sensitive information?

Robots.txt is not a good way to hide sensitive information. The robots.txt file merely informs Google's bots which parts of the content are not intended for them; it does not prevent the server from serving that content when a user requests it.

This means that if a user types the address of a blocked section directly into the address bar, it will load without any problem.

Search engines still know the addresses of these sections; they are only barred from reading the content and titles. Suppose you do not want a page to appear in Google search results, but you do not mind users reaching it through a link. The robots.txt file is a great fit for this.

Example: the site's internal search page. It is reachable through links within the site, but Google cannot see it, and the information on the page is not sensitive.
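When a page genuinely must stay out of search results, a noindex directive or server-side authentication is the usual alternative to robots.txt. A sketch of the meta tag approach:

```html
<!-- Placed in the <head> of the page to be kept out of search results.
     The page must NOT be blocked in robots.txt, or the crawler never sees this tag. -->
<meta name="robots" content="noindex">
```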

3. Help Google and users understand the content.

When a Google crawler processes a page, it should see the page the same way an average user does. To make pages easier for Google to index, give Google access to your JavaScript and CSS files and images. If the robots.txt file restricts access to these files, Google's algorithm will have trouble processing the page content, which results in lower search rankings or no indexing at all.

Suggested methods for easier page processing by Google:

1. With the help of the Fetch as Google tool in Google Search Console, make sure that Google's bots can process the site's JavaScript, CSS, and image files. Many site problems can be easily identified and solved this way.

2. Check the contents of the robots.txt file in Google Search Console.

4. Create a suitable and unique title

The Title tag informs both the user and Google of the page's topic. Choose a title that tells the user and Google what specific subject the page covers. The Title tag must be placed inside the Head tag. Each page should have only one Title tag, and each page title should be unique.

This means that no two pages on the same site should have the same title.
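A minimal sketch of a unique, descriptive Title tag (the shop name and wording are placeholders):

```html
<head>
  <!-- Exactly one <title> per page, unique across the site -->
  <title>Men's City Bikes | Example Bike Shop</title>
</head>
```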

5. Create great titles for your pages.

The page title is what links users to your page in search results, so choose your titles with great care.

Tip 1: If a page has multiple Title tags, no error message is generated; instead, the displayed title may be a combination of them. However, Google treats this as a negative signal.

Tip 2: The title shown for the site's home page in search results may contain values beyond the content of its Title tag. This happens because of the page's connections to all of the site's internal pages.

Suggested methods for choosing the right title:

1. Try to describe the page’s content in the page title.

2. Choose a title that represents the page's content to the average user.

Also, avoid:

  • Choosing a title that has no semantic relevance to the page content
  • Using generic titles such as Untitled 1, New Page, Page One, or Home Page

Create a unique title for each page

Having the same title on multiple pages makes it difficult for Google to decide which of them to rank for that shared title.

Create concise but descriptive titles

A title can be both short and descriptive. If the title is too long, Google will truncate it. Google may also display the page's title differently depending on the search term.

Also, avoid:

  • Very long titles that add no useful information
  • Unnecessary repetition of keywords in the page title

6. Use the meta description

The meta description tells Google what the page is about. While the page title contains only a few words, the description can be a sentence or two, or even a short paragraph. In Google Search Console, you can see reports on whether page descriptions are appropriate or duplicated.
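A hypothetical meta description, placed inside the Head tag alongside the title:

```html
<head>
  <title>Men's City Bikes | Example Bike Shop</title>
  <!-- One or two sentences describing this specific page -->
  <meta name="description"
        content="Browse our range of men's city bikes, from budget commuters to premium models, with free delivery and a two-year warranty.">
</head>
```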

7. What is the importance of meta descriptions?

Google may display the meta description as the page's description in search results. However, Google is under no obligation to use it and may select another section of the site's content instead.

If part of your text is more relevant to the user's search phrase, that part will be used as the description. If Google cannot find content that matches the search term, it will most likely use the meta description.

8. Suggested methods for inserting appropriate meta descriptions:

Create relevant descriptions with content.

Write your description in a way that makes the audience want to read your content, and try to keep it within Google's limits. Google accepts different description lengths for different searches; the safest description is about 60 words, roughly 320 characters.

Also, avoid the following:

  • Writing descriptions that have nothing to do with your content
  • Using generic descriptions such as “This is a web page” or “A page about bicycles”
  • Filling the description tag with a list of keywords
  • Copying the entire page content into the description

9. Use unique descriptions for each page

Using a unique description appropriate to the content of each page helps users and search engines find the page they want more easily.

Tip: Avoid reusing one description across a large number of pages.

10. Use heading tags for important text

Heading tags are rendered in larger fonts than regular text by default. This larger font signals to the reader that the text beneath is a subsection related to the heading. Using the various heading tags on a page creates a tree structure, which makes the page easier for both users and search engines to read.
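The tree structure described above might look like this in HTML (the headings and topic are placeholders):

```html
<h1>Choosing a City Bike</h1>
<p>Introduction…</p>

<h2>Frame Size</h2>
<p>How to pick the right frame…</p>

<h2>Brakes</h2>
<h3>Rim Brakes</h3>
<p>…</p>
<h3>Disc Brakes</h3>
<p>…</p>
```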

11. How can we easily choose the main headings and subheadings?

An outline is a plan in which you specify the main headings and subheadings; you then create the content based on that plan.

Also, avoid the following:

  • Putting text in headings that does not help identify the page structure
  • Using heading tags where Em or Strong tags are more appropriate
  • Jumping randomly from one heading size to another: it is best to use headings in order, from H1 down to H6

12. Use headings sparingly across the page.

Use a heading only where it makes logical sense. Too many heading tags reduce a page's readability; when headings are everywhere, it is hard to tell where one section ends and the next begins.

Also, avoid the following:

  • Excessive use of heading tags on the page
  • Very long headings
  • Using heading tags to style text rather than to present the page's structure

13. Add Schema markup (product prices, working hours) to search results

Schema is a markup vocabulary that helps search engines analyze a site's information. With these codes, search engines can display your site in search results in a practical and eye-catching way, which helps you attract the right audience.

For example, if you have an online store and mark up one of its product pages this way, the user can see the bike's price and the number of user reviews directly from your result. The snippet shown can change based on the search term, which is why these are called rich results.
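As a sketch of how a product page might expose its price and review count, here is hypothetical JSON-LD markup using schema.org's Product type (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example City Bike",
  "offers": {
    "@type": "Offer",
    "price": "299.00",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```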

One use of structured data markup is to include a store's opening hours. You have seen examples of this in Google Maps.
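Store opening hours can also be expressed as structured data; a sketch using schema.org's LocalBusiness type (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bike Shop",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield"
  },
  "openingHours": "Mo-Fr 09:00-18:00"
}
</script>
```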

A lot of information about a business can be provided this way. Examples:

  • The goods you sell
  • Store location
  • A video about your product or business
  • Working hours
  • List of events
  • Product preparation method
  • Company logo, and more

You can make your search results much more attractive with structured data in your HTML, or with the Data Highlighter tool (adding stars, photos, and so on to Google search results).

Suggested ways to use this tool

Once you have added structured data markup to a page, you can use the Google Structured Data Testing Tool to check your code. The tool accepts either a page URL or pasted code to verify.

Use Data Highlighter

If you want to structure how your site appears in Google search results without changing its code, you can use Data Highlighter (a webmaster tool for determining how the site appears in search results, including star ratings). The tool is available for free in Google Search Console.

Using the Markup Helper tool, you can generate ready-made code that determines how the site is displayed in search results.