
Technical SEO: A Comprehensive Guide – Yemenat 2023

Technical SEO is the cure for your site’s ailments; ignoring these problems means losing your pages’ ranking in search engines, and that is very bad.

Let’s learn in this article about everything related to technical SEO and how to solve your site’s problems.

What is technical SEO?

Technical SEO simply deals with everything related to the site’s technical problems and errors, in order to keep your pages ranking stably in search engines. It is one of the main parts of SEO.

By (technical website errors) I mean everything related to slow pages, code errors, and deleted pages, as well as archiving errors, template problems, and much more.

Technical SEO factors: archiving


Any website or blog owner aspires to have his site archived quickly. By archiving I mean the appearance of your page in Google, regardless of where it ranks.

When you write a new article and press the publish button, you must wait for Google’s spiders to crawl it, which is not as easy a process as you might imagine, especially if your site is newly created.

And even if Google’s spiders crawl the article, that does not mean it will appear in Google. To improve the archiving process, take the following steps:

Add your site to Google Search Console

After you create your site, publish a reasonable number of articles before registering it in the Google Search Console tool, so that you start with a good balance of content.

I mean, imagine adding your site to the tool when you do not have any content. Tell me, how will Google view your site, if it even notices it at all?

Add a sitemap to your site


Adding a sitemap means telling the crawling spiders which links on your site they are allowed to crawl, which somewhat eases the task for Google’s spiders.

If your blog runs on WordPress – and I hope it does – you can create a sitemap very simply, using a dedicated plugin such as Yoast SEO, for example.
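For illustration, a minimal sitemap file looks something like this (the URL and date here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/technical-seo</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```

A plugin like Yoast SEO generates and updates such a file automatically; you only need to submit its link once in Google Search Console.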

robots.txt file


This file lets you give commands and directives to search engines about the pages you want crawled, or the pages you want to block from crawling. To make sure this file exists on your site, simply append /robots.txt to the end of your site’s root URL, and the file will appear with its own directive lines.

WordPress automatically creates a robots.txt file by default, and let me explain the meanings of its terms:
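A typical default WordPress robots.txt looks like this (the domain in the sitemap line is a placeholder):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```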

User-agent : the type of spiders that are allowed or refused to crawl your site’s pages. There are Google spiders, Yahoo spiders, and some SEO-tool spiders. When you specify an asterisk (*), it means the commands are directed to all crawling spiders.

Disallow : means prohibition orders.

Allow : It means permission commands.

In the default file, all search engine spiders are instructed to ignore the wp-admin folder, to allow the admin-ajax.php file to be crawled, and finally the sitemap link is added.

I want you not to worry too much about this file; it is enough to check that it looks like the default to make sure everything is normal.
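If you want to verify programmatically which pages your file blocks, Python’s standard library can parse these rules. A small sketch, using hypothetical rules mirroring the WordPress default:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the WordPress default.
# Note: Python's parser applies the FIRST matching rule, so the
# narrow Allow line is listed before the broader Disallow.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/blog/some-article"))        # True
```

In practice you would point the parser at your live file with `set_url()` and `read()`; parsing the rules inline just makes the behavior easy to see.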

Internal links

Internal links help the archiving process a great deal: an internal link passes authority from old pages to new pages, and keeps the latter from becoming orphans.

Example :

Article 1, old and powerful, on: Learn SEO.

Article 2 on: internal SEO.

Article 3 on: external SEO.

Finally, Article 4, the one you are reading right now, on: technical SEO.

Article 1 gave strength to Articles 2-3-4.

Article 2 gave strength to Articles 3-4.

Article 3 gave strength to the new Article 4.

If Article 4 is published without any article pointing to it, crawling spiders will find it hard to discover and archive.
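In HTML terms, such an internal link is simply a contextual anchor inside the old article’s body (the URL below is hypothetical):

```html
<!-- Inside Article 1 (Learn SEO), pointing at the new Article 4 -->
<p>... and for fixing your site's health issues, see my guide to
   <a href="/technical-seo">technical SEO</a>.</p>
```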

Question: My site is new and I published my first article. Who will give it power?

Here you will market your article on social media, run ads for it, or request archiving manually so that Google can see it.

Technical SEO Factors: Site Structure

Site structure is overlooked by many website owners despite its extreme importance. When you organize the general layout of your site – the home page, the categories, and the internal linking – you give a good impression to the visitor on one hand, and make crawling easier for spiders on the other, which means quick archiving and a great chance of ranking at the top.

Many bloggers create a rudimentary home page and assign categories as they please, and this is wrong.

Be clear with Google and do not tire crawling spiders.

SEO experts say, “ The depth of your site’s hierarchy should not exceed 4 levels. ”

How is that ?

Let me give an example of my blog, The Art of Profit, and let’s see how the crawling spiders journey is.

The first thing the crawling spiders enter is the blog’s home page, and its depth is 0.

On the home page, I put the comprehensive main article directories, and the services that are important to me, in addition to the classifications, and their depth is 1.

After that, the articles and sub-sections branch off from the main categories, and their depth is 2.

Finally, there are private tracking links in depth 3.

So the maximum depth crawling spiders need in order to reach all my articles is only 2. To reach the article (profit from blogging), for example, you must pass through:

Home page (depth 0).

The internet-profit category (depth 1).

The profit-from-blogging article (depth 2).
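The click-depth idea above can be sketched as a breadth-first search over a hypothetical link graph, where each page’s depth is its minimum number of clicks from the home page:

```python
from collections import deque

# Hypothetical link graph: page -> pages it links to.
links = {
    "home": ["category", "services"],
    "category": ["article"],
    "services": [],
    "article": [],
}

def click_depth(graph, start="home"):
    """Breadth-first search: minimum number of clicks from the home page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in graph.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

print(click_depth(links))
# {'home': 0, 'category': 1, 'services': 1, 'article': 2}
```

Keeping the maximum value of this mapping small is exactly what the 4-level rule asks for.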

Breadcrumbs (navigation paths)

Breadcrumbs are a tour guide for the visitor throughout your site, and one of the factors that helps structure it.

When you enter the home page, choose a specific section, and from that section choose an article to read, the breadcrumb trail reminds you of the path you came from, so that if you want to take a step back you can do so with ease.
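Breadcrumbs can also be marked up so that search engines display them in the results. A minimal BreadcrumbList snippet in JSON-LD, with hypothetical names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "SEO",
      "item": "https://example.com/seo/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```

The last item (the current page) can omit its link, since the visitor is already there.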

Technical SEO factors: duplicate content

Undoubtedly, duplicate content is bad for both the visitor’s experience and the search engines. To overcome this problem, the solution is very simple: be unique in your content.

Many blogs are flooded with repetitive articles on the same topic; the right approach is to cover the topic in one comprehensive article, supported by other articles close to it.

For example, in my blog, I talked about defining SEO in general and comprehensively, then I talked about internal SEO, external SEO, and technical SEO in a broader and more detailed way.

As for repeating (the definition of SEO), for example, in most of your articles, that is bad practice.

Canonical to avoid duplicate links

We often find bad duplication in page links, especially in online stores.

Example :

Suppose I have a page for selling winter coats, and the link to this page is as follows:

fanribh.com/winter-coats

It is natural for these coats to come in several options and colors. If you choose red or yellow, for example, the two links will look like this:

fanribh.com/winter-coat-red

fanribh.com/winter-coat-yellow

This is duplication of the link structure. Instead of excluding the color links from archiving (noindex), we resort to the better solution: setting canonicals for these links, a directive that tells search engine spiders that the main link they should crawl and archive is:

fanribh.com/winter-coats

As for all the other links related to colors and shapes, they belong to the main link, and therefore they are ignored.
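In practice, the canonical is a single tag in the head of each color variant, pointing back at the main page (using the example domain above):

```html
<!-- On fanribh.com/winter-coat-red and fanribh.com/winter-coat-yellow -->
<link rel="canonical" href="https://fanribh.com/winter-coats" />
```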

Technical SEO factors: Speed

Good content doesn’t matter, perfect internal linking doesn’t matter, and tight site structure doesn’t matter if your pages are slow.

Let me ask you: What do you feel when you click on a page link and it takes a long time for it to open?

You will definitely get annoyed, and you will go back to Google and look for another page that opens quickly.

That is why Google officially adopted Core Web Vitals as one of the main ranking factors; it is an update concerned with providing a fast browsing experience for the visitor.

In order to make your site fast, you must focus on many things, the most important of which are:

Hosting

You should be careful to choose a reputable host known for the quality of its services and the speed of its servers.

Template

You should also take care to choose a template with a fast, responsive design, so that browsing is quick on the one hand, and crawling your pages is easier for search engine spiders on the other.

In my WordPress-blog-creation service, I offer a paid professional template that takes speed and responsiveness into account.

I advise you to view this service if you do not have hosting and a blog yet.

Compress image size

There is no doubt that images are a vital element for the aesthetics of the article and for providing a useful reading experience for the visitor, but care must be taken to compress their size so as not to burden the page.

There are many plugins that compress image sizes, including Smush.

Add caching

Caching plugins help greatly to speed up your site: they save a copy of your site in the visitor’s browser, so that if he returns, the site loads faster than before. These plugins also compress your template’s code and the rest of your site’s assets, in addition to many other advanced optimizations.

And the best caching plugin ever is the famous WP Rocket, a paid plugin priced at $49 per year for a single site. But you can get it for free in the WordPress blog-creation offer I mentioned earlier, or in my SEO course.

Technical SEO and its tools

There are many tools that check and detect errors on your site, and I will mention to you the most important tools that I personally use.

Google Search Console (Webmaster Tools)

The Google Search Console tool provides you with detailed reports on technical SEO problems, covering:

Deleted-page errors.

Crawl errors and archiving problems.

Sitemap status.

Server errors.

Speed errors.

Schema status.

The giant tool Screaming Frog

It is impossible to find someone who works in SEO and does not know this giant tool. It performs an accurate and comprehensive examination of your site’s health, so you can quickly fix any problems it finds.

This tool does the following:

Check the site’s internal and external pages.

Check archiving status.

Check the site’s code.

Check the security certificate.

Check the robots.txt file status.

Check the sitemap.

Check link status.

Check titles and descriptions.

Check for duplicate content.

Check images.

Check canonicals.

Check schema.

Check speed.

Screaming Frog is free up to a limit of 500 crawled links, which is more than enough for small blogs.

Ahrefs Webmaster Tools SEO Checker


In its free version, Ahrefs offers a comprehensive technical-SEO checking tool that gives you a detailed report of all your site’s internal and technical errors, and suggests the best solutions to overcome these problems.

ahrefs Webmaster Tools
