@Admin said:
I’d love to help you with this, but I have no idea where to start. I have very little experience with Joomla and I’m not familiar with its tags functionality. If someone with more Joomla experience can help boil this down to a hosting issue, I can look into it.
With that in mind, you may wish to ask this question on a site like Stack Overflow or the Joomla forums instead. They’ll know more about Joomla and what could cause this issue. If it turns out to be a hosting problem, you can ask the question here again so we can address the hosting issue itself.
Fair enough, thanks for the clarification. It might be something to do with how Google is crawling my site; I don’t have any idea whatsoever, but it says to contact the system administrator, and that’s you guys, isn’t it? This is interesting: I’m getting access denied, as in the image…
C) Access denied
Access denied means Googlebot can’t crawl the page. Unlike a 404, where Googlebot reaches the URL but finds nothing there, an access denied error means Googlebot is prevented from crawling the page in the first place.
What they mean
Access denied errors commonly occur when Googlebot is blocked in one of these ways:
You require users to log in to see a URL on your site, so Googlebot is blocked
Your robots.txt file blocks Googlebot from individual URLs, whole folders, or your entire site (see the sketch after this list)
Your hosting provider is blocking Googlebot from your site, or the server requires users to authenticate by proxy
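If you want to check the robots.txt cause yourself, a rule that blocks Googlebot from a folder looks roughly like the one in the sketch below, and Python’s standard-library robotparser can tell you which URLs that rule would deny. The domain, folder, and URLs here are only placeholders; swap in your own robots.txt contents (or point the parser at your live file with set_url/read).

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt rules: block Googlebot from everything under /private/
rules = """
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Test a couple of placeholder URLs against the rules above
for url in ("https://example.com/", "https://example.com/private/page.html"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```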
Are they important?
As with soft 404s and 404 errors, if the pages being blocked are important for Google to crawl and index, you should take immediate action.
If you don’t want the blocked pages to be crawled and indexed, you can safely ignore the access denied errors.
How to fix
To fix access denied errors, you’ll need to remove whatever is blocking Googlebot’s access:
Remove the login from pages that you want Google to crawl, whether it’s an in-page or popup login prompt
Check your robots.txt file to ensure the pages listed there are meant to be blocked from crawling and indexing
Use the robots.txt tester to see warnings on your robots.txt file and to test individual URLs against your file
Use a user-agent switcher plugin for your browser, or the Fetch as Google tool, to see how your site appears to Googlebot (a quick command-line check is sketched after this list)
Scan your website with Screaming Frog, which will prompt you to log in when a page requires it
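If you don’t have a user-agent switcher installed, a rough check along the lines of the sketch below does much the same thing with the Python standard library: it requests the same page with a normal browser user agent and with Googlebot’s published user-agent string, then compares the status codes. The URL is a placeholder for one of your own pages.

```python
import urllib.error
import urllib.request

# Placeholder URL: replace with a page from your own site
URL = "https://example.com/"

# A generic browser user agent and Googlebot's documented user-agent string
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, user_agent in AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(request) as response:
            print(f"{name}: HTTP {response.status}")  # e.g. "googlebot: HTTP 200"
    except urllib.error.HTTPError as err:
        print(f"{name}: HTTP {err.code}")  # a 403 only for googlebot suggests UA-based blocking
```

Bear in mind this only catches blocking that keys off the user-agent header; if your host blocks Googlebot by IP range, the request will still succeed from your machine, so Fetch as Google remains the more reliable check.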
While not as common as 404 errors, access denied issues can still harm your site’s ranking ability if the wrong pages are blocked. Be sure to keep an eye on these errors and rapidly fix any urgent issues.