SEO AUDIT
HTTP Response
200 - :smiley: the page at the URL requested was successfully processed by the web server.
301 :smiley: Moved permanently - recommended for SEO over 302
302 :unamused: Temporary redirect - correct it to a 301 if the move is permanent
404 :red_cross: page not found
503 :lock: - Web server unavailable
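A minimal Python sketch of this check, using only the standard library: it fetches the raw status code for each URL without following redirects, so 301 vs 302 responses stay visible. The URLs in the list are placeholders.

```python
import http.client
from urllib.parse import urlparse

def raw_status(url):
    """Return the HTTP status code of the first response, without following redirects."""
    parts = urlparse(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    else:
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    try:
        conn.request("GET", parts.path or "/")
        return conn.getresponse().status   # 200, 301, 302, 404, 503, ...
    finally:
        conn.close()

# Placeholder URLs - replace with the pages being audited.
for url in ["https://example.com/", "https://example.com/old-page"]:
    print(url, "->", raw_status(url))
```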
Robots.txt
Make sure we are not blocking any important page from being indexed
Find the robots.txt file in the server's root directory - example.com/robots.txt
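A minimal Python sketch of this check, assuming example.com and the list of important paths are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse the live robots.txt and test whether key pages may be crawled.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

for path in ["/", "/products/", "/blog/"]:   # placeholder list of important pages
    allowed = rp.can_fetch("*", "https://example.com" + path)
    print(path, "-> crawlable" if allowed else "-> BLOCKED by robots.txt")
```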
Site Map
Every time you make a change to the website, resubmit the sitemap
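A minimal Python sketch that lists the URLs declared in the sitemap, so the submitted file can be compared with the live site after a change; the sitemap location is a placeholder:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://example.com/sitemap.xml"   # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap, then print every <loc> entry it declares.
with urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    print(loc.text.strip())
```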
Robots meta tags - allow you to restrict parts of a page from being indexed
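A minimal Python sketch that reports any robots meta tag found on a page; the URL is a placeholder:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaFinder(HTMLParser):
    """Collects the content of <meta name="robots"> tags, e.g. "noindex, nofollow"."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")

url = "https://example.com/some-page"   # placeholder
with urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives or "no robots meta tag (indexable by default)")
```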
Site architecture
Step 1: Run Crawl Test
Links
External
Internal
Which links return errors?
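A minimal Python sketch of this check: it collects the links on a single page, classifies them as internal or external, and flags those returning an error status. The start URL is a placeholder and only one page is inspected.

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

START_URL = "https://example.com/"   # placeholder

class LinkCollector(HTMLParser):
    """Collects absolute URLs from every <a href> on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(START_URL, href))

def status_of(url):
    """HEAD-request a URL and return its status code, or None if unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code
    except URLError:
        return None

with urlopen(START_URL, timeout=10) as resp:
    collector = LinkCollector()
    collector.feed(resp.read().decode("utf-8", errors="replace"))

site = urlparse(START_URL).netloc
for link in collector.links:
    kind = "internal" if urlparse(link).netloc == site else "external"
    code = status_of(link)
    if code is None or code >= 400:
        print(f"ERROR {code} ({kind}): {link}")
```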
Pages
Pages restricted from being indexed?
Are there pages with disproportionate H1, H2, H3 tags?
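A minimal Python sketch that counts the H1/H2/H3 tags on a page so disproportionate use (for example several H1s) stands out; the URL is a placeholder:

```python
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadingCounter(HTMLParser):
    """Counts how many h1, h2 and h3 tags appear on the page."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.counts[tag] += 1

url = "https://example.com/some-page"   # placeholder
with urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

counter = HeadingCounter()
counter.feed(html)
print(dict(counter.counts))
if counter.counts["h1"] != 1:
    print("Warning: expected exactly one H1 on the page")
```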
Meta descriptions
Are any meta descriptions exceeding the length limit? (see the sketch below)
Obsolete keywords? Replace them
Check Latent Semantic Indexing (LSI) keywords
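A minimal Python sketch that pulls a page's meta description and warns when it runs long. The 160-character threshold is an assumption based on common guidance, and the URL is a placeholder.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

MAX_LENGTH = 160   # assumed threshold, adjust to the limit you work to

class MetaDescriptionFinder(HTMLParser):
    """Captures the content of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

url = "https://example.com/some-page"   # placeholder
with urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

finder = MetaDescriptionFinder()
finder.feed(html)

if finder.description is None:
    print("No meta description found")
elif len(finder.description) > MAX_LENGTH:
    print(f"Too long ({len(finder.description)} chars): {finder.description!r}")
else:
    print(f"OK ({len(finder.description)} chars)")
```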
Images
Which images are slow to load?
Are all in JPG or WebP format?
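A minimal Python sketch that checks the declared format and size of each image on a page via HEAD requests; the page URL, the allowed formats and the 200 KB size budget are assumptions to adjust.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen

PAGE_URL = "https://example.com/"         # placeholder
ALLOWED = {"image/jpeg", "image/webp"}    # assumed target formats
MAX_BYTES = 200_000                       # assumed per-image budget

class ImageCollector(HTMLParser):
    """Collects absolute URLs from every <img src> on the page."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(urljoin(PAGE_URL, src))

with urlopen(PAGE_URL, timeout=10) as resp:
    collector = ImageCollector()
    collector.feed(resp.read().decode("utf-8", errors="replace"))

for src in collector.sources:
    try:
        with urlopen(Request(src, method="HEAD"), timeout=10) as head:
            ctype = (head.headers.get("Content-Type") or "").split(";")[0].strip()
            size = int(head.headers.get("Content-Length") or 0)
    except Exception:
        print(f"Could not check: {src}")
        continue
    if ctype not in ALLOWED or size > MAX_BYTES:
        print(f"Review: {src} ({ctype or 'unknown type'}, {size} bytes)")
```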
Site Performance
Check current loading time
Target loading time: between 1.5 and 2 sec.
Use Google PageSpeed Insights
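A minimal Python sketch that times the initial HTML response. This only measures the first response, not the full render that Google PageSpeed reports, so treat it as a rough proxy; the URL is a placeholder.

```python
import time
from urllib.request import urlopen

url = "https://example.com/"   # placeholder
start = time.perf_counter()
with urlopen(url, timeout=10) as resp:
    resp.read()                # download the HTML to time the whole response
elapsed = time.perf_counter() - start

print(f"Fetched {url} in {elapsed:.2f} s")
if elapsed > 2.0:
    print("Slower than the 1.5-2 s target - investigate with Google PageSpeed Insights")
```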
Current search ranking for each keyword
Replace keywords with low potential
Content
Categorise content according to the type of keyword
Goal
Awareness: the content provides knowledge to the user
Require action: the content asks users to take an action
Check where the traffic is coming from
Are they in balance? The bots crawl the first 150 links.
Rectify the broken links
Define the level of friendliness for users and spider bots.
How to check the level:
Have a count: how many clicks does it take a user to reach the deepest page of the website? Take that number and keep it LOW (see the crawl sketch below)
Follow a FLAT website structure
Reorganise internal links
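A minimal Python sketch of the click-depth check: a small breadth-first crawl from the homepage that records how many clicks away each internal page sits. The start URL and the 50-page cap are placeholders.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "https://example.com/"   # placeholder
MAX_PAGES = 50                       # keep the sketch small

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from every <a href> on one page."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base, href))

site = urlparse(START_URL).netloc
depth = {START_URL: 0}               # page URL -> clicks from the homepage
queue = deque([START_URL])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        with urlopen(url, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception:
        continue
    extractor = LinkExtractor(url)
    extractor.feed(html)
    for link in extractor.links:
        link = link.split("#")[0]    # ignore fragments
        if urlparse(link).netloc == site and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

print("Deepest pages found:")
for url, d in sorted(depth.items(), key=lambda kv: -kv[1])[:5]:
    print(f"  {d} clicks: {url}")
```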
Goal: Understand which strategies are working so far.
Goal: Make sure the website is correctly indexed and identify which pages are not.
Step 2: Check for accessibility factors