The Invisible Foundations of Your Website
HTTPS, speed, robots.txt: your site's technical foundations determine whether Google can find it. If they're broken, nothing else matters.

The building analogy that explains everything
Imagine you've opened a beautiful shop on the high street. The window display is stunning, the products are excellent, the staff are friendly. But there's a problem: the front door is jammed shut. Customers can see your shop through the glass, but they can't get in.
That's exactly what broken technical foundations do to your website. You can have the best content, the most beautiful design, and the most compelling offer in your industry. But if the technical basics aren't right, Google can't properly access your site, AI can't read your content, and your customers experience a slow, frustrating mess that drives them away.
The good news: these technical issues are almost always fixable. And once they're fixed, everything else you do online works better.
HTTPS: the padlock that builds trust
Look at the address bar in your browser right now. You'll see a small padlock icon next to the web address. That padlock means the site uses HTTPS — a secure connection that encrypts data between the visitor's browser and the website.
If your site still runs on plain HTTP (no padlock), three things happen:
Browsers warn visitors away. Chrome, Safari, and Firefox all display warning messages on HTTP sites. Some show "Not Secure" in the address bar. Others display a full-page warning that asks the visitor if they really want to continue. Most people don't. They hit the back button and find a competitor with a padlock.
Google ranks you lower. Google confirmed years ago that HTTPS is a ranking factor. Sites without it start at a disadvantage. It's not the biggest factor, but when you're competing with similar businesses in your area, every advantage matters.
AI may skip your content entirely. AI crawlers from ChatGPT and Perplexity are cautious about citing insecure sources. If your site doesn't use HTTPS, you're less likely to appear in AI-generated answers. It's a trust signal, and you're failing it.
How to check
Type your web address into your browser. If it starts with "https://" and shows a padlock, you're fine. If not, your hosting provider can usually enable HTTPS for free using a service called Let's Encrypt. Most modern hosts offer this with a single click in their control panel.
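If you keep a list of pages to audit, the same check can be scripted. Here is a minimal Python sketch that inspects the address itself (the hypothetical yourdomain.com stands in for your own site); confirming that the secure version actually loads still requires opening it in a browser or fetching it over the network.

```python
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Return True if the address declares a secure HTTPS connection."""
    return urlparse(url).scheme == "https"

print(uses_https("https://yourdomain.com"))  # True
print(uses_https("http://yourdomain.com"))   # False
```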
A real-world example
A florist in Edinburgh had been running her site on HTTP for three years. She had lovely photos, detailed service descriptions, and strong Google reviews. But her site showed "Not Secure" to every visitor. After switching to HTTPS (which her hosting provider did in under an hour), her organic traffic increased by 23 percent over the following two months. Nothing else changed. Just the padlock.
Speed: every second costs you customers
When someone clicks on your website from a Google result, they expect it to load fast. Not in five seconds. Not in three. They expect it to feel nearly instant.
The numbers are stark:
- 1 second load time: around 7 percent of visitors leave before seeing anything
- 3 seconds: that jumps to 32 percent
- 5 seconds: more than half your visitors are gone
- 10 seconds: you've lost nearly everyone
These aren't hypothetical figures. Google's research consistently shows that speed is one of the strongest predictors of whether someone stays on your site or bounces back to the search results.
What slows your site down
For most small business websites, the culprits are predictable:
Oversized images. A single uncompressed photo from your phone can be 5 megabytes. A properly optimised version of that same image can be 200 kilobytes — 25 times smaller — while looking identical to the human eye. If your pages have multiple large images, this is almost certainly your biggest speed problem.
Too many plugins or scripts. Every widget, tracker, chat bubble, social media feed, and analytics tool you add to your site loads additional code. A site with 15 plugins will almost always be slower than one with 5. Audit what you actually use and remove the rest.
Cheap or overloaded hosting. If you're paying three pounds a month for hosting, your site shares a server with hundreds of other websites. When traffic spikes — say a competitor goes down, or you get shared on social media — the server struggles and your site slows to a crawl. Decent hosting doesn't have to be expensive. Between 10 and 30 pounds per month gets you reliable performance for a small business site.
No caching. Caching means your server saves a ready-made version of your pages instead of rebuilding them from scratch for every visitor. Without caching, your server does unnecessary work with every page view, which slows everything down. Most website platforms offer caching either built-in or through a simple add-on.
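The idea behind caching can be sketched in a few lines of Python. This is an illustration of the principle, not how your hosting platform implements it: the expensive work happens once, and repeat visitors get the saved copy. The `time.sleep` call is a stand-in for the database queries and templating a real server would do.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)
def render_page(slug: str) -> str:
    """Pretend to build a page from scratch (slow on the first visit)."""
    time.sleep(0.1)  # stand-in for database queries and template rendering
    return f"<html>content for /{slug}</html>"

render_page("services")  # slow: the page is built from scratch
render_page("services")  # fast: the cached copy is served instead
print(render_page.cache_info())  # shows one cache hit
```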
How to check your speed
Google provides a free tool called PageSpeed Insights. Enter your web address and it will score your site from 0 to 100, with specific recommendations for improvement. A score above 90 is excellent. Between 50 and 90 needs work. Below 50 is actively hurting your business.
A real-world example
A plumber in Manchester had a website that took 8 seconds to load on mobile. His bounce rate was over 70 percent — meaning seven out of ten visitors left without doing anything. The cause? Twelve high-resolution project photos that were each over 4 megabytes. After compressing the images and enabling caching, his load time dropped to 2.1 seconds. His bounce rate fell to 35 percent, and enquiry form submissions doubled within six weeks.
robots.txt: the file that tells Google where to look
Every website has (or should have) a small text file called robots.txt. It sits at the root of your site — yourdomain.com/robots.txt — and it tells search engines and AI crawlers which parts of your site they're allowed to visit.
Think of it as a sign at the entrance of a building. It might say "Welcome, explore freely" or it might say "Staff only beyond this point." Google and AI crawlers respect these instructions.
When robots.txt goes wrong
The most common problem we see is a robots.txt file that accidentally blocks everything. It looks like this:
User-agent: *
Disallow: /
Those two lines tell every search engine and every AI crawler: "Do not visit any page on this site." Your site effectively becomes invisible. Google won't index it. ChatGPT won't read it. Perplexity won't cite it.
This happens more often than you'd think. A web developer might add this line during development to prevent the unfinished site from appearing on Google, then forget to remove it when the site goes live. We've seen businesses run for months — sometimes years — with a robots.txt that blocks all search engines, wondering why they have zero organic traffic.
What a healthy robots.txt looks like
For most small business websites, robots.txt should be simple:
User-agent: *
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
This tells all crawlers they're welcome to visit any page, and points them to your sitemap (a list of all your pages) so they can find everything efficiently.
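You can verify what a robots.txt actually permits without guessing, using the robots parser in Python's standard library. This sketch feeds it the two example files from above (with the hypothetical yourdomain.com) and asks whether a crawler may fetch a page:

```python
from urllib.robotparser import RobotFileParser

def crawler_allowed(robots_lines: list[str], url: str) -> bool:
    """Parse a robots.txt and report whether any crawler may fetch the URL."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch("*", url)

blocking = ["User-agent: *", "Disallow: /"]
healthy = ["User-agent: *", "Allow: /"]

print(crawler_allowed(blocking, "https://yourdomain.com/services"))  # False
print(crawler_allowed(healthy, "https://yourdomain.com/services"))   # True
```

The same parser is what many crawlers rely on, so if it says a page is blocked, Google and AI crawlers will treat it as blocked too.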
A real-world example
An estate agent in Bristol noticed that none of their property listings appeared on Google, despite the pages being live and well-written. The problem was a leftover robots.txt from the developer that blocked the entire /listings/ directory. After fixing that single file, Google began indexing their property pages within a week. Within a month, they were receiving organic traffic to individual listings for the first time.
Your sitemap: the map Google uses to find your pages
A sitemap is an XML file that lists every page on your site that you want Google and AI to know about. It's like giving Google a complete table of contents for your website.
Without a sitemap, Google has to discover your pages by following links from one page to another. If a page isn't linked from anywhere — or is buried deep in your site's structure — Google may never find it.
Why this matters for small businesses: If you have 20 service pages but Google has only found 8 of them, you're invisible for the services covered by the other 12. A sitemap makes sure nothing gets missed.
How to check
Type yourdomain.com/sitemap.xml into your browser. If you see an XML file listing your pages, you have a sitemap. If you get a "not found" error, you don't. Most website platforms can generate one automatically — in WordPress, plugins like Yoast or Rank Math create and update your sitemap without any manual effort.
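If you want to count exactly which pages your sitemap lists, the XML is easy to read programmatically. This sketch parses a minimal example sitemap (the URLs are placeholders) with Python's standard library:

```python
import xml.etree.ElementTree as ET

# A minimal sitemap, shaped like the files most platforms generate.
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://yourdomain.com/services</loc></url>
  <url><loc>https://yourdomain.com/contact</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every page address listed in a sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP))  # prints all three page addresses
```

Comparing that list against your actual service pages tells you immediately whether anything is missing from the sitemap.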
Meta tags: the labels Google reads first
When Google visits one of your pages, the first things it reads are the meta tags — invisible labels in your page's code that describe what the page is about. The two most important ones are:
Title tag: This is what appears as the blue clickable link in Google results. If your title tag says "Home" or is blank, Google has no idea what your page is about, and neither does anyone searching.
A good title tag for a wedding photographer might be: "Wedding Photography in Surrey | Natural, Relaxed, Beautiful — Jane Smith Photography"
Meta description: This is the short text that appears below the title in Google results. It doesn't directly affect rankings, but it determines whether people click on your result or scroll past it.
A strong meta description for a physiotherapy clinic: "Expert physiotherapy in Leeds for sports injuries, back pain, and post-surgery recovery. Same-week appointments available. NHS and private patients welcome."
What happens when meta tags are missing
If you don't set these tags, Google guesses. It pulls random text from your page and uses it as the description. Sometimes this works out fine. Often it doesn't. You might end up with a Google result that shows your cookie policy text or a random sentence from your footer.
AI models also use these tags to understand your pages. A clear, descriptive title tag helps ChatGPT categorise your content correctly, which makes it more likely to cite you in relevant answers.
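To see your own tags the way a crawler does, you can extract them from the page source. Here is a small sketch using Python's built-in HTML parser, run against a sample page based on the wedding photographer example above:

```python
from html.parser import HTMLParser

class MetaTagReader(HTMLParser):
    """Collect the <title> text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

PAGE = """<html><head>
<title>Wedding Photography in Surrey | Jane Smith Photography</title>
<meta name="description" content="Natural, relaxed wedding photography across Surrey.">
</head><body></body></html>"""

reader = MetaTagReader()
reader.feed(PAGE)
print(reader.title)        # the blue clickable link Google shows
print(reader.description)  # the snippet shown below it
```

If either value comes back empty for your own pages, you've found the gap Google is currently filling with guesswork.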
Mobile friendliness: most of your visitors are on their phone
In the UK, over 60 percent of web searches happen on mobile devices. For local businesses — restaurants, trades, shops — that number is even higher, often above 75 percent.
Google uses "mobile-first indexing," which means it evaluates the mobile version of your site first, not the desktop version. If your site looks great on a laptop but is a mess on a phone — with text too small to read, buttons impossible to tap, or content that requires sideways scrolling — Google notices, and your rankings suffer.
Quick mobile check
Open your website on your phone right now. Can you read everything without zooming? Can you tap every button without accidentally hitting something else? Does the page load in under three seconds? If the answer to any of these is no, you have a mobile problem that's costing you customers.
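One technical signal behind these symptoms is the viewport meta tag: without it, phones render the full desktop layout shrunk down, which is why text becomes unreadably small. A rough sketch of checking for it in a page's source (a real audit would parse the HTML properly):

```python
def has_viewport_tag(page_source: str) -> bool:
    """Rough check: does the page declare a mobile viewport?"""
    lowered = page_source.lower()
    return 'name="viewport"' in lowered or "name='viewport'" in lowered

mobile_ready = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
desktop_only = "<head><title>Home</title></head>"

print(has_viewport_tag(mobile_ready))  # True
print(has_viewport_tag(desktop_only))  # False
```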
How these foundations work together
Here's what's important to understand: these technical elements aren't independent problems. They compound.
A site with no HTTPS, slow speed, a blocking robots.txt, and missing meta tags isn't just four problems — it's practically invisible. Google can't access it properly, can't trust it, can't understand what it's about, and can't deliver a good experience to its users. AI crawlers face the same obstacles.
Conversely, a site with strong technical foundations amplifies everything else you do. Great content performs better when it loads fast. Excellent service descriptions get found when Google can actually read them. AI cites your business more often when it can trust and access your pages.
The 10-minute technical health check
You don't need a developer to assess your site's technical health. Here are five checks you can do right now:
- HTTPS: Visit your site. Is there a padlock in the address bar? If not, contact your hosting provider.
- Speed: Run your site through PageSpeed Insights. Is your mobile score above 50?
- robots.txt: Visit yourdomain.com/robots.txt. Does it say "Disallow: /" anywhere? If so, that's blocking Google.
- Sitemap: Visit yourdomain.com/sitemap.xml. Does a sitemap exist? Does it list all your important pages?
- Mobile: Open your site on your phone. Is it easy to navigate and read?
If any of these checks reveal problems, fixing them should be your top priority — before investing in content, advertising, or design improvements.
Get a full technical health report
TryGEO checks all of these technical foundations automatically. It scans your HTTPS setup, measures your speed, reads your robots.txt, validates your sitemap, and tests your mobile experience. In under a minute, you'll know exactly what's working, what's broken, and what to fix first.
Because the best content in the world doesn't help if nobody can find it.
Check your visibility for free
Full diagnostic of your homepage in 30 seconds — free, no signup.
Test my site

Alexandre Aumont
Founder of TryGEO. Passionate about the web, artificial intelligence and chess.
April 3, 2026