Our Standards

Editorial Policy

How ToolStackVault selects, tests, scores, and updates software reviews — and how we keep affiliate relationships from influencing our conclusions.

Last updated: March 2026
Questions? info@toolstackvault.com
Our commitment: Every review on ToolStackVault is based on hands-on testing by someone who has actually used the tool. We do not copy vendor marketing copy, publish “reviews” from affiliate description pages, or adjust scores based on commission rates. If a tool is mediocre, we say so — even if it has an affiliate program.

Why We Built ToolStackVault

Most SaaS review sites are built to rank affiliate links, not to help buyers. Their “reviews” are rehashed marketing descriptions padded to 3,000 words. Rankings are determined by commission rate, not product quality.

ToolStackVault was built to do the opposite. We cover four categories — Web Hosting, Email Marketing, SEO Tools, and AI Tools — and for each one, we invest in actual hands-on testing before publishing a word. Our goal is to become the site we wished existed when we were building our own software stack.

How We Select Tools to Review

We prioritize tools based on a combination of factors:

  • Market relevance — tools that a meaningful number of buyers are actually evaluating
  • Category coverage gaps — ensuring our pillars have complete, comparable coverage
  • Reader requests and search demand — what buyers are actively searching for
  • Significant product updates — when a major version change warrants a fresh look

We do not prioritize tools because they have affiliate programs, offer high commissions, or because a vendor contacts us. Vendor outreach does not accelerate or influence our review calendar.

Test Methodology by Category

Each category has a distinct testing framework based on what matters most for buyers in that space.

WordPress Hosting

  • Minimum 90-day live testing period
  • Automated time-to-first-byte (TTFB) checks every 5 minutes from 6 global locations (see the probe sketch after this list)
  • Uptime monitoring across the full test period
  • Support tested with pre-written ticket scenarios
  • Performance benchmarked against identical test sites on competing hosts
  • Server stack, PHP version, caching setup documented
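For readers curious what a TTFB probe looks like in practice, here is a minimal single-check sketch in Python, assuming the third-party requests library. The URL is a placeholder, and the 5-minute scheduling and multi-location fan-out are not modeled here.

    # Minimal single-probe TTFB measurement (application-level: the timing
    # includes DNS, TCP, and TLS setup). Assumes `pip install requests`.
    import time
    import requests

    def measure_ttfb(url: str, timeout: float = 30.0) -> float:
        """Seconds from request start until the first response byte arrives."""
        start = time.perf_counter()
        with requests.get(url, stream=True, timeout=timeout) as resp:
            resp.raise_for_status()
            next(resp.iter_content(chunk_size=1))  # blocks until first byte
        return time.perf_counter() - start

    print(f"TTFB: {measure_ttfb('https://example.com/') * 1000:.1f} ms")

Run from several regions on a fixed interval, probes like this one produce the latency series our hosting benchmarks are built on.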

SEO Tools

  • Minimum 30-day hands-on testing
  • Keyword data accuracy cross-referenced against known baselines (illustrated after this list)
  • Crawl quality assessed on real test sites
  • Workflow evaluation: onboarding, reporting, export formats
  • Feature comparison against at least two direct competitors
  • Pricing and plan limits tested at the tier we review
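To make the baseline cross-check concrete, here is a hypothetical sketch using mean absolute percentage error (MAPE) as the accuracy metric. All keywords and volumes below are invented for illustration and do not reflect any real tool's data.

    # Hypothetical example: compare a tool's reported search volumes
    # against baseline figures using mean absolute percentage error.
    baseline = {"crm software": 18000, "email api": 4400, "vps hosting": 9900}
    reported = {"crm software": 22000, "email api": 3600, "vps hosting": 10200}

    def mape(expected: dict[str, int], actual: dict[str, int]) -> float:
        """Mean absolute percentage error over keywords present in both sets."""
        errs = [abs(actual[k] - v) / v for k, v in expected.items() if k in actual]
        return 100 * sum(errs) / len(errs)

    print(f"Keyword-volume MAPE: {mape(baseline, reported):.1f}%")  # 14.5%

A lower MAPE against the baseline set means the tool's keyword estimates track reality more closely.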

Email Marketing

  • Full account creation and onboarding evaluation
  • Automation workflows built and run end-to-end
  • Deliverability assessed via test campaigns to seed lists
  • Template editor and design tools tested hands-on
  • Segmentation and list management evaluated at scale
  • Support response quality and documentation assessed

AI Tools

  • Minimum 14-day hands-on testing period
  • Tested on real projects, not synthetic demos
  • Output quality assessed across multiple use cases
  • Workflow integrations and API access evaluated
  • Pricing transparency and usage limits tested
  • Compared against at least one direct alternative

Scoring Framework

Every reviewed tool receives a score from 1.0 to 10.0, derived from weighted assessments across five dimensions. Scores are not arbitrary — each dimension has a defined rubric and the rationale behind the final score is explained in the review body.

Performance & Speed
Measured where quantifiable (hosting uptime/TTFB, SEO crawl speed, AI response latency). For tools where performance is harder to measure, this dimension is weighted toward reliability and consistency.
Features & Functionality
Breadth, depth, and quality of the feature set relative to what buyers in this category actually need. We do not reward feature bloat — we assess whether features work well and solve real problems.
Ease of Use
How quickly a new user can achieve their goal. Includes onboarding, UI clarity, documentation quality, and the day-to-day workflow experience. Not just “pretty UI” — we weigh actual task completion friction.
Pricing & Value
Cost-effectiveness at the tested tier. We consider hidden costs, overage pricing, contract terms, and what you actually get relative to alternatives at a similar price point.
Support & Reliability
Quality and responsiveness of customer support (tested directly). Documentation completeness. Service reliability over the test period. Vendor transparency about incidents and downtime.

The overall score is a weighted composite. Hosting reviews weight Performance and Reliability more heavily; AI tool reviews weight Features and Ease of Use more heavily. The exact weights are noted in the methodology box at the top of each review.
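For those who want the arithmetic spelled out, here is a minimal sketch of the composite calculation. The weights shown are hypothetical examples, not our actual category weights; those are listed in each review's methodology box.

    # Illustrative weighted-composite calculation. These example weights
    # are hypothetical and lean toward Performance and Support, as a
    # hosting review might; they are not our published weights.
    HOSTING_WEIGHTS = {
        "performance": 0.30,
        "features": 0.15,
        "ease_of_use": 0.15,
        "pricing": 0.15,
        "support": 0.25,
    }

    def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
        """Combine per-dimension scores (1.0-10.0) into one weighted score."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
        return round(sum(scores[d] * w for d, w in weights.items()), 1)

    example = {"performance": 9.2, "features": 7.5, "ease_of_use": 8.0,
               "pricing": 6.8, "support": 8.4}
    print(composite_score(example, HOSTING_WEIGHTS))  # 8.2

Shifting weight between dimensions is what lets the same five rubrics produce category-appropriate scores.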

Content Update Policy

SaaS products change constantly. A review that was accurate six months ago may no longer reflect reality. Here is how we handle that:

  • Monthly sweeps — all published reviews are checked once a month for pricing, plan, and major feature changes
  • Pricing/feature patches — updated within 7 days of detection or reader report
  • Major re-tests — triggered when a product releases a significant new version, changes its pricing model materially, or receives a sustained change in market reputation
  • Last updated dates — all reviews display a “Last Updated” date so you can judge freshness yourself

We do not bump "Last Updated" dates without making changes in order to pass off old content as fresh. A date update reflects a genuine content revision.

Corrections Policy

We make mistakes. When we do:

  • Factual errors (wrong pricing, incorrect feature claim) — corrected promptly, with an inline correction note on the affected article
  • Scoring changes — if a score changes significantly after re-testing, we document the original score, new score, and the reason for the change
  • Reader corrections — reviewed within 48 hours; if valid, we fix the content and credit the reader if they wish
  • Vendor corrections — vendors may submit factual corrections through our contact form. Factual corrections are acted on; requests to improve tone or remove criticism are declined

To report an error, email info@toolstackvault.com with the subject “Correction” and the URL of the affected page.

Affiliate Relationships & Editorial Independence

ToolStackVault participates in affiliate programs for some of the tools we review. Here is exactly how that works — and doesn’t work:

What our affiliate relationships do NOT do

  • Influence scores, rankings, or conclusions in any review
  • Determine which tools get reviewed or featured in best-of lists
  • Allow vendors to preview, approve, or request changes to reviews
  • Generate “paid review” content — we do not accept payment for coverage

What our affiliate relationships DO mean

  • If you click a link to an affiliated tool and purchase, we receive a commission
  • We disclose affiliate relationships on a dedicated Affiliate Disclosure page and in review headers
  • Affiliate commissions fund our testing subscriptions and publishing operations

We purchase test accounts with our own money or use vendor-provided free trials available to any user. We do not accept complimentary upgrades, exclusive preview access, or other incentives in exchange for favorable coverage.

How to Reach Us

If you have a question about our methodology, want to flag an error, or have concerns about our editorial independence, we want to hear from you.

Email us at info@toolstackvault.com or use the contact form.

ToolStackVault is a small, independent publication. We don’t have a newsroom of 50 people — but we do have consistent standards and a genuine commitment to honest reviews. Hold us to them.