How Do You Fix Issues in Your Site Audit Report?
After you have finished running your Site Audit, you will want to check your audit results in the report titled “Dashboard.” This will give you an evaluation of the on-page website health, including a list of issues that need your attention.
In addition to errors and warnings, your report may also contain “Tasks” that vary in terms of their severity. The information contained there can be valuable for improving your site and it’s recommended to start with the “High Priority” tasks for better results.
This article will explain what notices appear in your site audit report and how you can fix them with proper solutions. Let’s get started.
How Do You Fix Notices?
Here are some of the notices that can come up after running a site audit:
- Problem: URLs longer than 200 characters
Lengthy URLs are not friendly to search engine optimization. Users are put off by URLs that are too long, and as a result, they are less likely to click on them or share them. This, in turn, hurts the click-through rate and usability of your page.
Solution: Keep URLs under 200 characters, using short, descriptive slugs that reflect the page content.
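The 200-character rule is easy to check yourself. Below is a small, hypothetical Python helper (the example URLs are made up) that flags any URL exceeding the limit described above:

```python
# Hypothetical helper: flag URLs that exceed a maximum length.
# The 200-character limit matches the audit rule described above.
MAX_URL_LENGTH = 200

def find_long_urls(urls, limit=MAX_URL_LENGTH):
    """Return the URLs whose total length exceeds `limit`."""
    return [u for u in urls if len(u) > limit]

urls = [
    "https://example.com/blog/short-descriptive-slug",
    "https://example.com/" + "category/" * 25 + "very-long-post-title",
]
print(find_long_urls(urls))  # only the second, 200+ character URL
```

You could feed this a list exported from your own crawl to find offenders in bulk.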
- Problem: Subdomains don’t support HSTS
HTTP Strict Transport Security (HSTS) instructs web browsers to use only HTTPS connections when interacting with a server. If your subdomains don’t support it, visitors can still reach them over insecure HTTP connections.
Solution: Make sure your server supports HSTS and serves the Strict-Transport-Security header with the includeSubDomains directive so that subdomains are covered as well.
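As a minimal sketch, the check above can be approximated in Python by parsing a Strict-Transport-Security header value (the format is defined in RFC 6797) and looking for the includeSubDomains directive. The header values below are illustrative examples, not output from any real server:

```python
# Minimal sketch: does an HSTS header value cover subdomains?
# Directives in the header are separated by semicolons (RFC 6797).
def hsts_covers_subdomains(header_value):
    """Return True if the Strict-Transport-Security value includes
    the includeSubDomains directive (case-insensitive)."""
    directives = [d.strip().lower() for d in header_value.split(";")]
    return "includesubdomains" in directives

print(hsts_covers_subdomains("max-age=31536000; includeSubDomains"))  # True
print(hsts_covers_subdomains("max-age=31536000"))                     # False
```

In practice you would read this header from a live HTTPS response for each subdomain rather than from a hard-coded string.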
- Problem: Pages with more than one H1 tag
Although HTML5 allows for more than one <h1> tag per page, it is recommended that you do not use more than one. Adding several <h1> tags at once can be confusing to readers.
Solution: Use a single <h1> tag for the page title, and structure the rest of the content with <h2>–<h6> tags.
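You can audit this yourself with nothing but the Python standard library. The sketch below counts <h1> tags in a page using html.parser; the sample HTML is a made-up illustration of the recommended single-<h1> structure:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count <h1> tags in an HTML document using only the stdlib."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

html = """
<h1>Main Page Title</h1>
<h2>First Section</h2>
<h2>Second Section</h2>
"""
counter = H1Counter()
counter.feed(html)
print(counter.h1_count)  # 1 — exactly one <h1>, as recommended
```

A count greater than one for any page is a candidate for cleanup.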
- Problem: Pages that were blocked from crawling
If a page is inaccessible to search engines, it will never appear in the list of search results. Either a robots.txt file or a noindex meta tag could be responsible for blocking a page from crawling.
Solution: Ensure that pages containing valuable content have not been mistakenly blocked from the crawling process.
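One quick way to verify this is Python's built-in urllib.robotparser, which evaluates robots.txt rules the same way a well-behaved crawler would. The rules and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Sketch: check whether a robots.txt rule blocks a given URL.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/private/doc"))  # False
```

If a valuable page comes back False here, your robots.txt is the likely culprit; a noindex meta tag would have to be checked in the page's HTML separately.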
- Problem: Robots.txt not found
The contents of the robots.txt file can greatly affect your site’s SEO. This file helps search engines understand which parts of your site to crawl, and using one can decrease the time it takes for search engine robots to crawl and index your site.
Solution: Create a robots.txt file if you don’t want certain parts of your website to be indexed. Use the robots.txt Tester in Google Search Console to ensure your file is configured properly.
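As an illustration, a minimal robots.txt might look like the following. The paths and sitemap URL are hypothetical; substitute your own:

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The file must live at the root of the host (e.g. https://example.com/robots.txt) for crawlers to find it.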
- Problem: Links with no anchor text
If a hyperlink on your site has blank anchor text (known as a “naked anchor”), or the text itself consists entirely of symbols, you may run into this problem. Users and crawlers can still follow a link even if it lacks an anchor, but they will find it difficult to determine what the linked page is about.
When crawling and indexing a page, Google also considers the anchor text used in the links between pages. The absence of an anchor means a missed chance to improve the search engine rankings of the linked-to page.
Solution: When creating a link, include appropriate anchor text. For users and search engines to understand what the linked page is about, the anchor text used in the link itself must provide some context. Use concise language that still conveys the necessary information.
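The definition above (blank anchor text, or text made entirely of symbols) can be turned into a small stdlib check. This is a simplified sketch, not the audit tool's actual logic, and the sample links are invented:

```python
from html.parser import HTMLParser

class EmptyAnchorFinder(HTMLParser):
    """Collect href values of links whose visible anchor text is
    empty or consists entirely of symbols (no letters or digits)."""
    def __init__(self):
        super().__init__()
        self.current_href = None
        self.current_text = ""
        self.empty_anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.current_href = dict(attrs).get("href")
            self.current_text = ""

    def handle_data(self, data):
        if self.current_href is not None:
            self.current_text += data

    def handle_endtag(self, tag):
        if tag == "a" and self.current_href is not None:
            text = self.current_text.strip()
            # Flag anchors with no alphanumeric content at all.
            if not any(c.isalnum() for c in text):
                self.empty_anchors.append(self.current_href)
            self.current_href = None

finder = EmptyAnchorFinder()
finder.feed('<a href="/pricing">See pricing</a> <a href="/hidden">&gt;&gt;</a>')
print(finder.empty_anchors)  # ['/hidden'] — the symbol-only anchor
```

Links surfaced this way are the ones to rewrite with short, descriptive anchor text.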