Part I — Foundation · Chapter 3 of 23 · 2 min read

The Technical Foundation Layer™

What you'll learn
  • Pass the Core Web Vitals (LCP, INP, CLS) benchmarks that directly influence rankings
  • Set up XML sitemaps, robots.txt, canonical tags, and mobile-first indexing correctly
  • Audit your technical foundation in under two hours using four free tools

Chapter Overview

Every strategy in this playbook — from the Service Authority Stack to Answer Engine Optimization — depends on a technical foundation that allows search engines and AI systems to access, crawl, and interpret your content efficiently. If your website is slow, difficult to navigate on mobile, or structured in a way that confuses search engine crawlers, even the best content will underperform.

This chapter is not designed to turn you into a web developer. It is designed to give you enough understanding of the technical layer to ensure your foundation is solid — and to know what to ask for when working with a developer or agency.

Core Web Vitals

Google publicly measures three performance metrics that influence how your site is evaluated:

Largest Contentful Paint (LCP)

This measures how quickly the main content of your page becomes visible. A good LCP is under 2.5 seconds.

Interaction to Next Paint (INP)

This measures how quickly your site responds when a visitor interacts with it. A good INP is under 200 milliseconds.

Cumulative Layout Shift (CLS)

This measures visual stability — whether elements on your page jump around as it loads. A good CLS score is under 0.1.

You can check all three metrics using Google's free PageSpeed Insights tool. Focus on mobile first — that is where most service searches happen.
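The three thresholds above can be expressed as a simple pass/fail check. This is a minimal sketch — the function name and the two-band "good / needs improvement" classification are illustrative simplifications, not Google's API — but the thresholds themselves are the ones cited in this chapter:

```python
def classify_vitals(lcp_seconds: float, inp_ms: float, cls: float) -> dict:
    """Classify Core Web Vitals against the 'good' thresholds cited in
    this chapter: LCP under 2.5 s, INP under 200 ms, CLS under 0.1."""
    return {
        "LCP": "good" if lcp_seconds < 2.5 else "needs improvement",
        "INP": "good" if inp_ms < 200 else "needs improvement",
        "CLS": "good" if cls < 0.1 else "needs improvement",
    }

# Example: a page that paints fast and responds fast, but shifts as it loads.
print(classify_vitals(1.8, 150, 0.25))
# → {'LCP': 'good', 'INP': 'good', 'CLS': 'needs improvement'}
```

Plug in the mobile numbers PageSpeed Insights reports for your own pages; any metric that comes back "needs improvement" is a candidate for your developer's to-do list.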

Mobile-First Indexing

Google primarily uses the mobile version of your website to determine rankings. Test your website on your own phone regularly. Can you find the phone number within three seconds? Can you navigate to any service page within two taps? If the answer to either question is no, mobile usability should be your first technical priority.
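One baseline worth confirming before anything else: the viewport meta tag. Without it, phones render the page at desktop width and shrink it down, making every tap target tiny. This is standard HTML, the same on any platform:

```html
<!-- In the <head> of every page: render at the device's actual width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

If your site is built on a modern theme or site builder, this is almost certainly already in place — but it is a ten-second check in your page source.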

Crawl Efficiency

XML Sitemaps

An XML sitemap tells search engines which pages exist on your website. Submit your sitemap through Google Search Console and review it periodically.
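For reference, a minimal sitemap following the sitemaps.org protocol looks like this — the URL and date are placeholders, and most platforms generate this file for you automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per indexable page; <lastmod> is optional -->
    <loc>https://www.example.com/services/drain-cleaning/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
```

When you review it periodically, check that new service pages appear and deleted pages do not.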

Robots.txt

Your robots.txt file tells search engines which areas of your site they should not crawl. An incorrectly configured robots.txt can block important pages from being indexed.
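As an illustration, here is a common WordPress-style configuration — the domain and blocked paths are examples, so adapt them to your own site rather than copying them verbatim:

```text
# Allow all crawlers everywhere except the admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The most damaging misconfiguration is a blanket `Disallow: /`, which blocks your entire site — sometimes left over from a staging environment. If you see that line in production, remove it.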

Canonical Tags and Location Page Duplication

When you create multiple location pages, canonical tags help prevent duplicate content issues. Pages that are 80 percent identical, with only the city name swapped, are likely to be filtered out of Google's results rather than ranked.
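A canonical tag is a single line in the page's `<head>`. For a genuinely distinct location page, the standard pattern is a self-referencing canonical — the page points at its own URL (the URL below is a hypothetical example):

```html
<!-- On https://www.example.com/plumber-austin/ : declare this URL
     as the preferred version of this page -->
<link rel="canonical" href="https://www.example.com/plumber-austin/">
```

The canonical tag is a hint, not a directive — if the pages are near-duplicates, Google may ignore it and pick its own canonical, which is why each location page still needs genuinely unique content.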

Google Search Console Essentials

Google Search Console is a free tool that every service business should have configured. Key areas to monitor include your indexing status, your search performance report, any manual actions flagged by Google, and your Core Web Vitals report.

Great content on a broken foundation is a house built on sand.

Up next

Part II — Content Authority
Chapter 4 — The AI Citation & Answer Engine Method™