Full Transparency

How We Test: The Methodology Behind Every Recommendation

Most hosting review sites never buy a single plan. We buy every account with our own money, test for 6+ months, and publish raw benchmark data. Here's exactly how every test, score, and ranking on ThatMy.com is produced — step by step, with real examples.

Real Accounts Only · 6-Month Min Test · No Vendor Perks
Mangesh Supe
Hosting Expert & Benchmarker

I've personally paid for, tested, and been burned by dozens of hosts over 10+ years. I currently run my own sites on ScalaHosting. I earn commissions when you use my links — but I call out bad hosts I could profit from because data beats commissions every time.

200+ Plans Tested · 6 mo+ Per-Host Testing · 250 Concurrent Users · 1,190 CPUs Ranked · $0 Vendor Sponsorships
Why This Page Exists

Most Hosting Reviews Are Based on Nothing. Ours Are Based on This.

I'm going to be blunt: the majority of hosting review websites on the internet have never purchased a single hosting plan. They rewrite the host's marketing page, add an affiliate link, and call it a "review." No tests. No data. No accountability.

I know this because I've read hundreds of competitor reviews while building ThatMy.com. I can tell in seconds which reviewers have never logged into a cPanel dashboard, never SSH'd into a server, never run a TTFB test. Their reviews say things like "blazing fast servers" and "excellent uptime" without a single number to back it up.

This page exists to show you — in uncomfortable detail — exactly what I do differently. Not because I think you'll enjoy reading about my testing methodology (you probably won't). But because I want you to understand why my recommendations look different from every other site.

When my #1 pick is different from what 50 other blogs recommend, you should know why. The answer is always data. This page shows you that data, the tools that produced it, and the process that keeps it honest.

Here's a quick example of why methodology matters:

What Most Review Sites Do

Read the host's marketing page. Copy the feature list. Add affiliate link. Say "great for beginners!" Collect $150 commission. Never buy the plan. Never test the server. Never read the TOS.

What ThatMy.com Does

Buy the plan at retail price. Install WordPress. Run TTFB tests with no CDN. Stress-test at 250 concurrent users. Identify the CPU model. Look up PassMark ranking. Read the full TOS. Search TrustPilot for complaints. Publish the raw numbers.

The result? My rankings often disagree with the "consensus." Bluehost is #1 on most review sites. On ThatMy.com, it's on the avoid list. Because when you actually test it — 480ms TTFB, 306% renewal price increase, 4,200+ 1-star TrustPilot reviews — the data tells a very different story than the marketing.

Below is every step of my process, in order, with real examples from actual tests I've run.

Step 1 of 8

We Buy Real Hosting Plans (With Our Own Money)

Every host reviewed on ThatMy.com is purchased with my own credit card at retail pricing. I don't use vendor-provided "press accounts," free review accounts, or demo environments that could have boosted performance. The plan I test is the exact same plan you'd buy if you clicked "Sign Up" right now.

This matters more than you think. Multiple hosts have offered me free accounts for review purposes. I've turned down every single one. Here's why:

The free account problem: When a hosting company gives a reviewer a free account, they know it's being reviewed. They can provision it on their fastest server, assign dedicated resources, or give it priority in the load balancer. The reviewer sees 50ms TTFB and writes a glowing review. You sign up and get 300ms TTFB because you're on a crowded shared server with 200 other accounts. The review was technically accurate — for the server they cherry-picked.

By purchasing at retail price through the normal checkout flow, I get the same experience you get. Same server assignment. Same resource allocation. Same support queue. If the checkout process has confusing upsells, I experience them. If the onboarding is slow, I experience that too.

Purchased at retail price — no free or sponsored accounts, ever
Standard plans (not enterprise, agency, or "reviewer" tiers)
Default server location — no cherry-picked data centers
WordPress 6.5+ installed with default theme and demo content
No special requests to support (no "please put me on a fast server")
200+ Hosting Plans Tested · 100+ Hosting Companies Evaluated · ISP-Level Infrastructure Background

What about the cost? Maintaining active hosting test accounts isn't cheap. I currently have active accounts on ScalaHosting, Cloudways, Kinsta, Hostinger, Contabo, ChemiCloud, and several others — costing roughly $200/month in hosting bills alone. These accounts exist solely for testing. I fund this through the affiliate commissions I earn when readers sign up through my links. It's a self-sustaining cycle: honest reviews → reader trust → affiliate revenue → more test accounts → more honest reviews.

For some hosts (GoDaddy, Bluehost, HostGator), I've finished testing and since closed the account. For these, I note the test date in my reviews and flag when the data might be outdated. I'll reopen accounts and re-test if I have reason to believe the host has significantly changed their infrastructure — which happens more often than you'd think.

Step 2 of 8

Standardized WordPress Setup (Same Config, Every Host)

The single biggest flaw in most hosting benchmarks is uncontrolled variables. If you test Host A with a lightweight theme and Host B with a bloated page builder, the difference you measure is the theme, not the server. Your benchmark is useless.

Every host I test gets the exact same WordPress installation:

WordPress version: Latest stable release (currently 6.5.x)
Theme: Twenty Twenty-Four (default WordPress theme, no customization)
Content: WordPress Theme Unit Test Data (standardized demo content with posts, pages, images, menus, and comments)
Plugins: None. Zero. Not even a security or caching plugin
CDN: Disabled. No Cloudflare, no host-provided CDN, no external caching
Server-side caching: Host default only (LiteSpeed Cache, Varnish, or whatever the host pre-installs)
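If you want to reproduce this baseline, it's roughly a five-minute job with WP-CLI. The block below is a sketch, not my exact provisioning script: the URL, credentials, and the path to the Theme Unit Test Data XML are placeholders, and the importer plugin is removed immediately after the import so the zero-plugins rule still holds.

```bash
# Sketch of the standardized test install using WP-CLI (URL, credentials, and XML path are placeholders).
wp core download
wp core install --url="https://test-site.example" --title="Benchmark Site" \
  --admin_user=admin --admin_password="change-me" --admin_email=admin@example.com
wp theme activate twentytwentyfour                     # default theme, no customization
wp plugin deactivate --all && wp plugin delete --all   # zero plugins

# Import the WordPress Theme Unit Test Data, then remove the importer again.
wp plugin install wordpress-importer --activate
wp import theme-unit-test-data.xml --authors=create    # placeholder path to the XML export
wp plugin deactivate wordpress-importer && wp plugin delete wordpress-importer
```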

Why no CDN or caching plugins? Because I'm testing the server, not the caching layer. A CDN can mask a slow server by serving cached static files from edge nodes. That's great for production websites — but it tells you nothing about the underlying server performance. When you run WooCommerce, process forms, or access your admin dashboard, those requests bypass the CDN and hit the actual server. That's the speed that matters, and that's what I measure.

The caching trap: I've seen hosts advertise "50ms response time" that's actually measuring Cloudflare's edge cache, not their server. Remove Cloudflare and the same host clocks in at 400ms. My tests always measure the server without CDN masking. When you see "28ms TTFB" in a ThatMy.com review, that's the raw server speed.

I do note what server-side caching the host provides by default (LiteSpeed Cache on LiteSpeed hosts, Varnish on some VPS platforms, OPcache settings) because this is part of the hosting product. But I never add external caching that you'd have to install yourself — because then I'd be testing the plugin, not the host.

Example: Why Standardization Caught Hostinger Off-Guard

In early 2025, I tested Hostinger's Premium plan with this standardized setup. Their TTFB came in at 182ms — decent, but far from their advertised "fastest hosting." Hostinger's marketing page shows speed test results taken with LiteSpeed Cache fully tuned, Cloudflare CDN active, and an optimized theme. My test strips all of that away. The 182ms is what your server actually does before any optimization.

Is 182ms bad? No. It's acceptable for shared hosting. But it's not the "30ms" their marketing implies, and it's significantly slower than ScalaHosting's 28ms VPS or ChemiCloud's 95ms shared hosting. Context matters. Raw numbers don't lie.

Step 3 of 8

TTFB & Speed Testing (The Most Important Number)

Time To First Byte (TTFB) is the single most important hosting speed metric. It measures how quickly the server responds to a request — the time from your browser sending a request to receiving the first byte of the response. This includes DNS resolution, TCP connection, TLS handshake, and server processing time.

TTFB tells you how fast the server is, independent of your website's code, theme, or content. A 50ms TTFB means the server is fast. A 500ms TTFB means the server is slow. No amount of image optimization, minification, or lazy loading can fix a slow server — because TTFB happens before any content is even delivered.
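You can see those phases yourself with a single curl request; the URL below is a placeholder:

```bash
# One request, broken into the phases that add up to TTFB (URL is a placeholder).
curl -o /dev/null -s https://example.com/ -w '
DNS lookup:    %{time_namelookup}s
TCP connect:   %{time_connect}s
TLS handshake: %{time_appconnect}s
TTFB:          %{time_starttransfer}s
Total:         %{time_total}s
'
```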

How I test TTFB:

Tool: curl with timing breakdown (curl -o /dev/null -s -w "TTFB: %{time_starttransfer}\n")
Locations: US East (Virginia), US West (Oregon), EU (Frankfurt), Asia (Singapore)
Timing: Tests run at multiple times of day (morning, afternoon, midnight) over 7+ consecutive days
Method: 50 requests per test session, median value used (not average — outliers don't skew the result)
CDN: Disabled — raw server speed only
Caching: First-byte timing (not full page load) — measures actual PHP/MySQL execution

Why median instead of average? Because one slow response out of 50 can spike the average by 100ms. The median gives you the "typical" experience — what you'd feel when loading the site on a normal visit. I report both median and P95 (the 95th percentile — the speed at which 95% of requests are faster) so you can see both the normal case and the worst case.
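A minimal version of the measurement loop looks like this: the curl one-liner above repeated 50 times, with the median and an approximate P95 pulled out. The URL is a placeholder; my full scripts also record location and timestamp per run.

```bash
#!/usr/bin/env bash
# Run 50 sequential TTFB measurements, then report the median and approximate P95 in milliseconds.
URL="https://example.com/"   # placeholder test site
RUNS=50

for i in $(seq "$RUNS"); do
  curl -o /dev/null -s -w "%{time_starttransfer}\n" "$URL"
done | sort -n > ttfb.txt

median=$(awk '{a[NR]=$1} END {printf "%.0f", a[int((NR+1)/2)] * 1000}' ttfb.txt)
p95=$(awk '{a[NR]=$1} END {printf "%.0f", a[int(NR*0.95)] * 1000}' ttfb.txt)  # approximate P95 index
echo "Median TTFB: ${median} ms"
echo "P95 TTFB:    ${p95} ms"
```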

Real TTFB Results — US East (Virginia) · 2026 Data
ScalaHosting (Managed VPS): 28ms
Kinsta (Managed WordPress): 62ms
ChemiCloud (Shared): 95ms
Cloudways (DigitalOcean): 110ms
Hostinger (Premium): 182ms
SiteGround (StartUp): 198ms
A2 Hosting (Startup): 245ms
Bluehost (Basic): 480ms
HostGator (Hatchling): 520ms
GoDaddy (Economy): 620ms

Look at that table. ScalaHosting at 28ms. GoDaddy at 620ms. That's a 22x difference in server response time. Both are hosting WordPress. Both are sold as "fast hosting." But one responds before you blink and the other takes longer than a Google search to process a single request.

This is why TTFB testing matters. Without it, you're relying on the host's marketing claim of "fast servers" — which every host claims, including GoDaddy.

A note on TTFB variability: TTFB isn't constant. It varies by time of day, server load, and even which physical server within a data center you're assigned to. That's why I test over 7+ days at different times. A single TTFB measurement is a snapshot. My published figures represent hundreds of data points.
28ms Fastest TTFB (ScalaHosting VPS) · 95ms Fastest Shared (ChemiCloud) · 620ms Slowest TTFB (GoDaddy)
Step 4 of 8

Load Testing — What Happens When Real Traffic Hits

A server can look fast with 1 visitor. The real test is what happens under load — when 50, 100, or 250 people are all requesting pages at the same time. This is where shared hosting plans fall apart and VPS plans prove their value.

I use Loader.io to simulate concurrent traffic. The test ramps up gradually: 10 users, 25, 50, 100, 150, 200, 250. At each level, I measure:

TTFB degradation: How much slower does the server get under load? A well-provisioned server shows <15% degradation at 100 users. A bad shared host degrades 200%+.
Error rate: Does the server return 500 errors, 503 errors, or timeouts? Any error above 0% under 250 concurrent users is a red flag.
P95 response time: What's the speed for the slowest 5% of requests? This tells you the worst-case user experience during peak traffic.
Recovery time: After load drops, how quickly does TTFB return to baseline? Some hosts throttle for minutes after a traffic spike.

Why this matters for you: If you're building a business website, you will eventually have traffic spikes. A blog post goes viral on Reddit. A product launch drives 500 visitors in an hour. A Black Friday sale. Your host needs to handle these spikes without crashing, throttling, or throwing errors. Load testing tells you whether it will.
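Loader.io handles the ramped testing, but you can get a rough feel for concurrency behavior with nothing more than curl. The sketch below is a crude local approximation, not the tool or methodology behind my published numbers; the URL is a placeholder.

```bash
#!/usr/bin/env bash
# Crude concurrency probe: fire N parallel requests, then count errors and show the median TTFB under load.
URL="https://example.com/"   # placeholder test site
CONCURRENCY=50

seq "$CONCURRENCY" | xargs -P "$CONCURRENCY" -I{} \
  curl -o /dev/null -s -w "%{http_code} %{time_starttransfer}\n" "$URL" > load.txt

echo "Non-200 responses: $(awk '$1 != 200' load.txt | wc -l) / $CONCURRENCY"
sort -k2 -n load.txt | awk '{t[NR]=$2}
  END {printf "Median TTFB under load: %.0f ms\n", t[int((NR+1)/2)] * 1000}'
```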

TTFB Degradation at 250 Concurrent Users
ScalaHosting (VPS): +10% degradation (31ms)
Kinsta (Managed WP): +18% degradation (73ms)
ChemiCloud (Shared): +85% degradation (176ms)
Cloudways (DO): +95% degradation (215ms)
Hostinger (Premium): +232% degradation (604ms)
Bluehost (Basic): CRASHED at 80 users
GoDaddy (Economy): CRASHED at 50 users

Look at Bluehost and GoDaddy — they didn't even survive the load test. The servers returned 503 errors before we hit 100 concurrent users. These are the same hosts that millions of people are using for their business websites. They literally cannot handle a moderately popular blog post.

ScalaHosting's VPS barely flinched — 10% degradation at 250 users. That's because you get dedicated CPU cores on a VPS. Nobody else's traffic affects your performance. On shared hosting, your site shares CPU with hundreds of other accounts, and when one of them spikes, everyone suffers.

The shared hosting lie: "Unlimited bandwidth" means nothing when your server crashes at 50 concurrent users. Bandwidth is a data transfer limit. Concurrency is a server processing limit. Every cheap shared host has the bandwidth. Almost none can handle the concurrency. This is the gap between what they sell and what they deliver.
+10% Best Degradation (ScalaHosting) · +232% Worst Surviving (Hostinger) · 250 Max Concurrent Users Tested
Step 5 of 8

CPU PassMark Verification — Exposing the Hardware Gap

Hosting companies love to say "powerful servers" and "high-performance infrastructure" without telling you what CPU you're actually running on. I find out. Here's how:

On VPS/Cloud hosts (where I have SSH access): I run cat /proc/cpuinfo to get the exact CPU model. Then I look up that model on PassMark's CPU Benchmark database, which ranks 1,190 server-class CPUs by performance.
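On a VPS, the identification itself takes seconds over SSH; the PassMark lookup is then done manually against their public database:

```bash
# Identify the hardware a VPS is actually running on.
grep -m1 "model name" /proc/cpuinfo   # exact CPU model string, e.g. "AMD EPYC 9474F"
nproc                                 # vCPU cores allocated to this instance
lscpu | grep -Ei "mhz|cache"          # clock speed and cache sizes, where the hypervisor exposes them
```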

On shared hosts (where SSH is limited): I check the host's blog, support documentation, and help articles for CPU model mentions. If that fails, I contact support directly and ask. Some hosts will tell you. Others won't — which tells you something too.

Why CPU model matters: The difference between a 2016 Intel Xeon E5-2650 and a 2024 AMD EPYC 9474F is massive — 10x+ in single-thread performance. Hosting companies that use older CPUs can offer lower prices because the hardware is depreciated. But your WordPress site executes PHP on a single thread, so single-thread CPU performance directly determines your page generation speed.

CPU PassMark Rankings — Actual Hardware We Found
ScalaHosting: AMD EPYC 9474F — #31 / 1,190 (102,432 pts)
Contabo: AMD EPYC 7443P — #72 / 1,190 (59,125 pts)
Cloudways (Vultr): AMD EPYC 7402 — #156 / 1,190 (38,520 pts)
Kinsta: Google Cloud C2D — varies by region
Hostinger: Mix of AMD EPYC and older Intel — inconsistent across accounts
Rocket.net: Intel Xeon E-2288G — #465 / 1,190 (15,831 pts)
Bluehost: Intel Xeon E5-2650 v4 — #650+ / 1,190 (~8,200 pts)

ScalaHosting's AMD EPYC 9474F at rank #31 has a PassMark score of 102,432 points. Bluehost's Intel Xeon E5-2650 v4 scores approximately 8,200 points. That's a 12.5x performance gap in raw CPU power. And yet both are sold as "hosting" at similar price points.

This is why I rank CPUs in every review. You can't compare hosting plans without knowing what hardware you're running on. A $3/mo shared plan on a 2024 EPYC processor will outperform a $10/mo plan on a recycled 2016 Xeon — and you'd never know without checking.

The "CPU cores" trick: Some VPS hosts advertise "8 CPU cores" without telling you what CPU model those cores come from. 8 cores of an AMD EPYC 9474F is a beast. 8 cores of an Intel Xeon E5-2640 from 2012 is barely faster than a modern dual-core. Always check the specific model, not just the core count.
#31 / 1,190 ScalaHosting CPU Rank · 102K Top PassMark Score (EPYC 9474F) · 12.5x Gap Between Best & Worst CPU
Step 6 of 8

12-Month Uptime Monitoring

Speed doesn't matter if your site is down. I monitor every host continuously for a minimum of 12 months using third-party uptime monitoring services. Every outage is logged with the exact duration, time, and whether the host acknowledged it.

Why 12 months? Because any host can have a good week. Even GoDaddy can show 100% uptime for 30 days. The patterns that matter — recurring outages, maintenance window issues, degraded performance during peak hours — only emerge over months of continuous monitoring.

How I monitor:

Check interval: Every 60 seconds from multiple monitoring nodes
Monitoring type: HTTP response check (not just ping — verifies actual page delivery)
Alert threshold: Any response code other than 200, or response time exceeding 10 seconds
Logging: Every incident logged with start time, end time, duration, and response code
Verification: When a downtime alert fires, I manually verify to eliminate false positives
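My production monitoring runs through third-party services, but the underlying check is simple enough to sketch. The URL is a placeholder, and a real monitor would also record recoveries and run from multiple locations:

```bash
#!/usr/bin/env bash
# Minimal uptime probe: one HTTP check per minute, log anything that isn't a fast 200.
URL="https://example.com/"   # placeholder monitored site

while true; do
  result=$(curl -o /dev/null -s -m 10 -w "%{http_code} %{time_total}" "$URL")
  code=${result%% *}    # HTTP status (000 on timeout/connection failure)
  total=${result##* }   # total response time in seconds
  if [ "$code" != "200" ]; then
    echo "$(date -u +%FT%TZ) DOWN http_code=$code time=${total}s" >> uptime.log
  fi
  sleep 60
done
```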
The 99.9% uptime myth: Every host "guarantees" 99.9% uptime. Let's do the math. 99.9% uptime allows 8.7 hours of downtime per year. That's one full work day where your site is offline and your customers can't reach you. And the "guarantee" is usually a service credit — not a refund. If your ecommerce store loses $1,000/hour during an outage, and the host gives you a $2.00 credit, are you satisfied? The SLA exists to protect the host, not you.
99.993% Best Uptime (ScalaHosting) · 12 mo Minimum Monitoring Period · 60 sec Check Interval

What I've learned from uptime monitoring: Most reputable hosts deliver 99.95%+ uptime consistently. The real differences emerge in how they handle incidents — scheduled maintenance communication, incident response time, and transparency about root causes. ScalaHosting has been the most transparent: they publish incident reports and proactively notify clients. Some hosts (I won't name names... actually, yes I will — GoDaddy and Bluehost) don't acknowledge outages at all unless you open a support ticket first.

Step 7 of 8

Renewal Price & TOS Deep-Dive (Where the Scams Live)

This is the step that most reviewers skip entirely — and it's the one that saves you the most money. The intro price is marketing. The renewal price is what you actually pay. And the Terms of Service is where the fine print hides the real resource limits, suspension policies, and billing traps.

For every host, I document:

Intro price vs. renewal price for every plan tier (the renewal increase ranges from 0% to 602% depending on the host)
Contract lock-in: What's the minimum billing period? Monthly? 12 months? 36 months? Most intro prices require 36-month prepayment
CPU/RAM limits from TOS/AUP: The real resource caps hidden in the "Fair Use" or "Acceptable Use" section
Inode limits: The file count limit that effectively caps your "unlimited" storage
Money-back guarantee conditions: What's excluded? Domain fees? Setup fees? Add-on services?
Cancellation process: How easy (or hard) is it to actually cancel and get your refund?
Hidden upsells: What's auto-checked during checkout? Domain privacy? Codeguard? SiteLock?
The Renewal Trap — Real Numbers
ScalaHosting Mini: $2.95 → $4.95/mo (+68%)
ChemiCloud Starter: $2.49 → $4.95/mo (+99%)
Hostinger Premium: $2.99 → $7.99/mo (+167%)
A2 Hosting Startup: $1.99 → $12.99/mo (+553%)
Bluehost Basic: $2.95 → $11.99/mo (+306%)
SiteGround StartUp: $2.99 → $17.99/mo (+502%)
HostPapa Starter: $2.95 → $12.99/mo (+340%)

See the difference? ScalaHosting's renewal is $4.95 — a 68% increase. That's manageable. SiteGround's renewal is $17.99 — a 502% increase. If you signed up for 36 months at $2.99/mo and forgot to cancel before renewal, your next year costs $215.88 instead of $35.88. That's not a price increase — that's a completely different product at 6x the cost.
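The arithmetic is worth running before you buy. Here is the SiteGround StartUp example from the table above, as a quick one-off calculation:

```bash
# Intro year vs. renewal year for a plan billed at $2.99/mo intro and $17.99/mo on renewal.
awk 'BEGIN {
  intro = 2.99; renew = 17.99
  printf "Intro year:   $%.2f\n", intro * 12
  printf "Renewal year: $%.2f\n", renew * 12
  printf "Markup:       +%.0f%%\n", (renew / intro - 1) * 100
}'
```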

The TOS story that made me angry: In 2023, I was reviewing a popular shared host (I've since published the full review). Their marketing said "unlimited storage." Their TOS, Section 11.2, said: "Customer storage shall not exceed 250,000 inodes." An inode is roughly one file. A typical WordPress install with plugins and a few hundred posts uses 50,000-100,000 inodes. So "unlimited storage" actually means "you can't have more than a moderately complex WordPress site." I showed this exact clause in the review. The host's affiliate manager emailed me asking me to remove it. I didn't.

Why I read the full TOS: Because the marketing page says "unlimited" and the TOS page says "25% CPU, 200,000 inodes, 1GB RAM, 20 entry processes." The TOS is the legal document. The marketing page is an advertisement. I quote the TOS.

$4.95 Lowest Renewal (ScalaHosting) · $17.99 Highest Renewal (SiteGround, +502%) · 602% Worst Renewal Markup Found
Step 8 of 8

TrustPilot, Reddit & Complaint Ecosystem Research

Speed tests and pricing analysis tell you what the product does. Complaint research tells you what happens when things go wrong — which they will, eventually, with every host. I want to know: When customers have problems, does the host fix them or fight them?

TrustPilot research method:

Total review count and overall star rating
1-star and 2-star review count (the actual complaint volume)
Keyword-filtered searches for specific complaint types: "suspend" · "slow" · "billing" · "refund" · "cancel" · "downtime" · "scam" · "phishing" · "cpu" · "migration"
Pattern detection: if 50+ reviews mention the same issue (e.g., "account suspended for CPU"), it's systemic
Response rate: does the host respond to complaints, and are the responses helpful or copy-paste?
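There's no special tooling behind the keyword filtering. Once reviews are exported to a local text file (reviews.txt here is a hypothetical file, one review per line), the pattern counting is a grep exercise:

```bash
# Count how many saved reviews mention each complaint keyword (case-insensitive).
# Assumes reviews.txt holds exported review text, one review per line (hypothetical file).
for kw in suspend slow billing refund cancel downtime scam phishing cpu migration; do
  printf "%-10s %s\n" "$kw" "$(grep -ci "$kw" reviews.txt)"
done
```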

Reddit research method:

r/webhosting — the main web hosting discussion subreddit
Provider-specific subreddits (r/Cloudways, r/SiteGround, etc.)
r/WordPress for hosting-related threads
Pattern analysis: if 15+ threads report the same issue, it's not bad luck — it's a product problem
Extreme anecdotes: data loss, billing disputes over $500+, multi-day outages
Example — Bluehost complaint pattern: On TrustPilot, Bluehost has 4,200+ 1-star reviews as of early 2026. Filtering by "suspend" returns 300+ results. Filtering by "refund" returns 400+ results. The pattern: customer signs up, site gets suspended for "resource usage," customer tries to cancel, host makes it difficult to get a refund. This isn't anecdotal — it's hundreds of independent people reporting the same experience. That data goes directly into my Bluehost review.

Why complaint research matters: A host with 100ms TTFB and great benchmarks can still ruin your business if their support is terrible, their billing is predatory, or they suspend accounts without warning. I've seen hosts with excellent technical performance but abysmal customer treatment. My reviews include both dimensions — because you deserve to know what happens when something breaks.

Ownership research: I also check who owns the hosting company. Private equity acquisitions are a reliable predictor of quality decline. When Newfold Digital acquired Bluehost and HostGator, support quality dropped and prices rose. When DigitalOcean acquired Cloudways, the same pattern began. I track ownership changes via Crunchbase, Wikipedia, and SEC filings because they affect your experience 12-24 months after the acquisition — long after the "everything stays the same" press release.

The Full Picture

How All 8 Steps Come Together Into a Ranking

After completing all 8 steps for a host, I have a comprehensive profile: raw server speed, load resilience, CPU power, uptime reliability, true pricing, TOS fine print, and real user complaints. Now I rank.

My ranking factors, weighted by importance:

30% · Server Performance: TTFB, load test results, CPU PassMark ranking. The core technical capability.
25% · True Value (Price/Performance): Renewal price, not intro price. What you actually pay per ms of TTFB performance.
15% · Uptime & Reliability: 12-month uptime record, incident transparency, recovery speed.
15% · TOS Honesty & Limits: Marketing vs. reality gap. Renewal markup. Hidden fees. Resource limit transparency.
10% · Customer Treatment: TrustPilot/Reddit complaint patterns, support quality, cancellation difficulty.
5% · Features & Ecosystem: Control panel, free SSL, backups, migration, staging, email, CDN inclusion.

Why performance is weighted highest (30%): Because it's the one thing you can't fix yourself. You can add a CDN. You can optimize your images. You can install a caching plugin. But you can't make a slow server fast. Server performance is the foundation — everything else is built on top of it.

Why features are weighted lowest (5%): Because features are commoditized. Every host offers free SSL, one-click WordPress, and automated backups. These are table stakes in 2026, not differentiators. I won't rank a host higher because they include "free domain" when their server is 3x slower than the competition.

Important transparency note: These weights are my editorial judgment, not an algorithm. I don't plug numbers into a formula and publish whatever score comes out. I use the data to inform my ranking, then apply judgment based on 10+ years of experience — including managing real network infrastructure at the ISP level, where I dealt with traffic shaping, load balancing, and server capacity firsthand. If two hosts have similar benchmarks but one has a terrible TrustPilot record, I'll rank the cleaner host higher even if the math says they're equal. Data informs the ranking. Experience finalizes it.
What We Don't Do

Things You'll Never See on ThatMy.com

Transparency isn't just about what I do — it's about what I refuse to do. Here are the practices you'll never find on this site:

Paid reviews. No hosting company has ever paid me to write a review, or been given editorial control over any page. If they offered, I'd decline. And then probably write a critical review out of spite.
Vendor-provided test accounts. I buy every plan at retail price. No free accounts, no "press" access, no demo environments with boosted performance.
Rankings based on commission rates. Bluehost pays $150+ per signup. I don't recommend them. ScalaHosting pays less. I recommend them anyway. Commission rates don't affect rankings. Period.
Fake testimonials or fabricated data. Every benchmark number on ThatMy.com comes from a real test I ran on a real server I paid for. If I haven't tested something, I'll say "I haven't tested this yet" rather than make up numbers.
Removing negative information at a host's request. I've received emails from affiliate managers asking me to soften negative reviews or remove TOS analysis. I haven't complied. Not once.
Ranking every host as "#1 Best For [Something]." If a host is bad, it gets a low ranking. If it's terrible, it goes on the avoid list. I'd rather have 5 good recommendations than 20 inflated ones.

I'd rather lose affiliate revenue by being honest than gain revenue by misleading the people who trust me. That's not just ethics — it's good business. Readers who trust you come back. Readers you've deceived don't.

— Mangesh Supe, Founder of ThatMy.com
Ongoing Process

Testing Never Stops — Our Ongoing Monitoring Process

My testing isn't a one-time event. Hosting companies change — they upgrade servers, raise prices, get acquired, change TOS policies, switch data centers. A review from 2023 can be dangerously outdated in 2026. Here's how I keep everything current:

Quarterly re-testing: Every active host gets re-benchmarked at least once per quarter. I re-run TTFB tests, check for CPU upgrades/downgrades, and verify current pricing. If a host has changed significantly, I update the review within two weeks.

Real-time uptime monitoring: Uptime checks run 24/7 on every active host. I get alerts within 60 seconds of any downtime event. This data continuously feeds into my uptime assessments.

Pricing surveillance: Hosting companies change prices without announcing it. I manually check pricing pages monthly and compare against my records. When I find a change, I update every page that references that host's pricing.

Ownership & industry tracking: I monitor hosting industry news for acquisitions, leadership changes, and infrastructure updates. When DigitalOcean acquired Cloudways, I re-evaluated and updated my Cloudways recommendation within a month — including projections about likely price increases that turned out to be accurate.

TrustPilot & Reddit monitoring: I check complaint ecosystems monthly for each major host. A sudden spike in 1-star reviews about a specific issue (e.g., "account suspended," "billing dispute") triggers an immediate review update.

The freshness commitment: Every review page on ThatMy.com shows a "Last Updated" date. If that date is more than 6 months old, the data might be stale and I'll note that explicitly. My goal is to update every review at least twice per year, with critical pages (top recommendations, comparison pages) updated quarterly.
Our Toolbox

The Exact Tools I Use (No Secret Sauce)

I'm not hiding my methodology behind proprietary tools. Here's every tool I use — most are free. You could replicate my tests yourself if you wanted to.

curl + Custom Scripts (TTFB Measurement)
The most reliable way to measure Time To First Byte. curl -o /dev/null -s -w "TTFB: %{time_starttransfer}" with timing breakdowns. Free. Reproducible. No third-party interpretation layer.

Loader.io (Load Testing / Stress Testing)
Cloud-based load testing tool. Simulates 50-250+ concurrent users. Measures response time distribution, error rates, and throughput. Free tier available for basic tests.

PassMark CPU Benchmark Database (CPU Performance Ranking)
The industry standard for CPU benchmarking. Their database ranks 1,190 server-class CPUs. I cross-reference every CPU model I find with their rankings for objective comparison.

Third-Party Uptime Monitors (24/7 Uptime Tracking)
HTTP-based monitoring with 60-second check intervals from multiple geographic locations. Alerts, incident logs, and uptime percentage calculation over rolling 12-month windows.

SSH + /proc/cpuinfo (CPU Model Identification)
On VPS hosts with SSH access: cat /proc/cpuinfo reveals the exact CPU model. On shared hosts: support inquiries, blog posts, and documentation analysis.

Manual TOS/AUP Review (Fine Print Analysis)
No tool for this one — just me, a cup of chai, and 40 pages of legalese. I extract CPU limits, inode caps, bandwidth policies, suspension conditions, and refund exclusions from every host's Terms of Service.

None of these tools are proprietary or expensive. The "secret" to ThatMy.com's methodology isn't special software — it's the willingness to actually spend the time and money to do the testing. Most review sites don't test because testing takes months and costs thousands of dollars. Writing a fake review from the marketing page takes 30 minutes and costs nothing.

Common Questions

Questions About Our Testing (Answered Honestly)

"Do you earn affiliate commissions?"

Yes. When you click an affiliate link on ThatMy.com and sign up for hosting, I earn a commission. This is disclosed on every page. Affiliate commissions are how I fund the $200+/month in test hosting accounts and the thousands of hours of research. But — and this is the critical difference — I recommend hosts based on test data, not commission rates. Bluehost pays me more per signup than ScalaHosting. I recommend ScalaHosting anyway. The data decides.

"How do I know your benchmarks are real?"

You don't — and you shouldn't trust anyone blindly. What I can tell you: every benchmark number comes from a test I ran on a plan I purchased. I describe the exact methodology (curl timing, Loader.io parameters, PassMark lookups) so you can replicate the tests yourself. I'd love it if readers verified my numbers. If my data is wrong, tell me — I'll re-test and correct it publicly.

"Why don't you test more hosts?"

Money and time. Each host costs $50-$200 to test (plan purchase + 6 months of monitoring). I currently test 28+ hosts. I prioritize hosts that readers actually consider — which means the major shared hosts, popular managed WordPress platforms, and VPS providers that serve the most customers. If you want me to test a specific host, tell me and I'll add it to the list.

"Doesn't your recommendation of ScalaHosting make you biased?"

Fair question. Yes, I use ScalaHosting for ThatMy.com. Yes, I recommend them. Could I be subconsciously biased? Maybe. Here's my counter: I've switched my #1 recommendation three times in 10 years (SiteGround → Cloudways → ScalaHosting). Each time, I switched because the data told me to, not because of any financial relationship. I recommended SiteGround when they deserved it. I stopped when they didn't. I'll do the same with ScalaHosting if they ever stop earning the #1 spot. My track record shows I follow the data, even when it costs me affiliate revenue.

"Why do your rankings disagree with other review sites?"

Because other review sites don't test. They rank based on commission rates, brand recognition, and recycled opinions. When you actually buy the plans and run benchmarks, the results look very different from the marketing. Bluehost ranks #1 on sites that earn $150 per signup without testing. On ThatMy.com — where I've tested Bluehost and measured 480ms TTFB, 306% renewal markup, and 4,200+ TrustPilot complaints — it's on the avoid list. The data is clear. My rankings reflect the data.

"How often do you update your reviews?"

Target: every review updated at least twice per year. High-priority pages (top recommendations, competitive comparisons) are updated quarterly. If a host changes pricing, upgrades hardware, or has a major incident, I update the relevant pages within two weeks. Every page shows its "Last Updated" date.

"Can hosting companies pay to improve their ranking?"

No. Not for any amount of money. I've been asked. I've declined. If that ever changes, ThatMy.com will be shut down the same day, because the site's entire value is built on honest rankings.

See the Data in Action

Now that you know how we test, explore the results. Every ranking, review, and comparison below was produced using the methodology you just read.

Best Hosting 2026 · Best VPS Hosting · Fastest Hosting · About the Author