SSH Host Certificates

Eliminate "trust this fingerprint?" forever
TL;DR for SREs & Platform Engineers
  • What they do: Let servers prove identity via CA signature—no more fingerprint prompts
  • Key benefit: Zero TOFU risk, zero known_hosts management, works with automation
  • Low risk: Deploy incrementally—servers offer certs, clients verify when configured

1. The TOFU Problem

Every SSH user has seen this:

The authenticity of host 'server01.example.com (10.0.1.50)' can't be established.
ED25519 key fingerprint is SHA256:xK3rV9mPqL2nB8wY5tH7jU4cF6gA1dS0zX9oN3iE2qM.
Are you sure you want to continue connecting (yes/no/[fingerprint])?

Be honest: When was the last time you actually verified that fingerprint?

This is TOFU - Trust On First Use. The security model is:

  1. First connection: blindly trust whatever server responds
  2. Future connections: warn if the key changes

The problems with TOFU:

Man-in-the-middle on first connection

If an attacker intercepts your first connection to a new server, you'll happily accept their key. You've now trusted the attacker.

Users trained to type "yes" blindly

Nobody calls the server admin to verify fingerprints. We've trained ourselves to bypass the security check.

known_hosts sprawl

Your ~/.ssh/known_hosts file grows with every server. New laptop? Start from scratch. Server rebuilt? "WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!" panic.

Automation nightmares

Scripts need StrictHostKeyChecking=no to work with new servers—effectively disabling the security check.

The irony: SSH's host key verification is solid security—if anyone actually used it properly. Host certificates make it automatic.

When host certificates might be overkill:

Very small, static environments (3-5 servers that rarely change) may not justify the CA setup overhead. If your team manually verifies fingerprints once and never rebuilds servers, traditional host keys work fine. But even modest growth or automation needs will tip the balance toward certificates.

2. How Host Certificates Work

The solution mirrors how TLS handles server identity:

┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Client    │     │     CA      │     │   Server    │
│(trusts CA)  │     │  (signs)    │◀────│ (has cert)  │
└─────────────┘     └─────────────┘     └─────────────┘
       │                                       │
       │  1. Connect to server                 │
       │──────────────────────────────────────▶│
       │                                       │
       │  2. Server presents host certificate  │
       │◀──────────────────────────────────────│
       │                                       │
       │  3. Client verifies cert against CA   │
       │     ✓ Valid? Connect!                 │
       │     ✗ Invalid? Reject!                │

What's happening:

  1. Server's host key is signed by a CA → creates host certificate
  2. Client trusts the CA (configured once)
  3. On connection, client verifies the server's certificate
  4. Valid certificate from trusted CA = automatic trust
  5. No fingerprint prompt. No known_hosts entry needed.

Same concept as TLS: Your browser trusts certificate authorities, which vouch for websites. SSH host certificates work identically—your SSH client trusts a CA, which vouches for servers.
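This flow assumes a host CA keypair already exists. If you are starting from scratch, creating one is a single ssh-keygen call; the paths and comment below are illustrative:

```shell
# Demo in a scratch directory; in production, generate and store the CA
# private key on a secured machine (HSM, Vault, or air-gapped host).
cd "$(mktemp -d)"

# -N '' skips the passphrase for this sketch; protect a real CA key with
# a passphrase or hardware-backed storage.
ssh-keygen -t ed25519 -N '' -f host_ca -C "example.com host CA"

ls host_ca host_ca.pub   # private key (keep secret) and public key (distributed to clients)
```

host_ca.pub is what clients will later trust via @cert-authority; host_ca signs every host certificate and never leaves the CA machine.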

3. Anatomy of a Host Certificate

Host certificates contain fields that identify the server:

| Field        | Purpose                              | Example                                   |
|--------------|--------------------------------------|-------------------------------------------|
| Key ID       | Identifier for logs                  | server01.example.com-20250114             |
| Principals   | Hostnames/IPs this cert is valid for | server01.example.com, server01, 10.0.1.50 |
| Valid After  | Certificate start time               | 2025-01-14T00:00:00                       |
| Valid Before | Certificate expiration               | 2025-07-14T00:00:00                       |
| Type         | Must be a host certificate           | host key                                  |

Key difference from user certificates: The -h flag marks it as a host certificate. Principals are hostnames/IPs, not usernames.

→ See our SSH Certificate Anatomy demo to explore these fields interactively.

4. Signing a Host Certificate

Sign the server's existing host key:

ssh-keygen -s /path/to/host_ca \
  -I "server01.example.com-20250114" \
  -h \
  -n server01.example.com,server01,10.0.1.50 \
  -V +52w \
  /etc/ssh/ssh_host_ed25519_key.pub

Flag breakdown:

| Flag | Meaning          | Notes                                        |
|------|------------------|----------------------------------------------|
| -s   | CA private key   | Your host CA (separate from user CA)         |
| -I   | Key ID           | Appears in logs, include hostname + date     |
| -h   | Host certificate | Critical! Without this, it's a user cert     |
| -n   | Principals       | All hostnames/IPs clients might use          |
| -V   | Validity         | Longer is OK for hosts: weeks or months      |

Output: Creates /etc/ssh/ssh_host_ed25519_key-cert.pub

Important: Include ALL names clients might use:
  • FQDN: server01.example.com
  • Short name: server01
  • IP address: 10.0.1.50
  • Any aliases or CNAMEs

If a client connects using a name not in the principals list, verification fails.
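The whole signing flow can be rehearsed end to end in a scratch directory before touching a real server. A sketch, with all names and paths illustrative:

```shell
tmp=$(mktemp -d)

# Stand-ins for the host CA and the server's existing host key
ssh-keygen -q -t ed25519 -N '' -f "$tmp/host_ca" -C "demo host CA"
ssh-keygen -q -t ed25519 -N '' -f "$tmp/ssh_host_ed25519_key"

# Sign the host public key: -h marks it as a host certificate
ssh-keygen -s "$tmp/host_ca" \
  -I "demo.example.com-$(date +%Y%m%d)" \
  -h \
  -n demo.example.com,demo \
  -V +52w \
  "$tmp/ssh_host_ed25519_key.pub"

# Inspect the result: certificate type, Key ID, principals, validity window
ssh-keygen -L -f "$tmp/ssh_host_ed25519_key-cert.pub"
```

The -L output should report a host certificate with both principals listed; if a hostname is missing here, clients using that name will fail verification.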

5. Server Configuration

Tell sshd to present the certificate:

# /etc/ssh/sshd_config

# Present this certificate to connecting clients
HostCertificate /etc/ssh/ssh_host_ed25519_key-cert.pub

Restart sshd:

systemctl restart sshd

That's it on the server side. The certificate file sits alongside the existing host key. The private key doesn't change.

Multiple key types? Add multiple HostCertificate lines:

HostCertificate /etc/ssh/ssh_host_ed25519_key-cert.pub
HostCertificate /etc/ssh/ssh_host_rsa_key-cert.pub
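Before restarting, it is worth letting sshd validate the new configuration; a typo in the HostCertificate path can otherwise leave you locked out:

```shell
# sshd -t ("test mode") exits non-zero and prints the offending line if the
# configuration or certificate path is invalid; restart only if it passes.
sshd -t && systemctl restart sshd
```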

6. Client Configuration

Clients need to trust the CA. Two options:

Option A: Per-user configuration

Add to ~/.ssh/known_hosts:

@cert-authority *.example.com ssh-ed25519 AAAA...

Option B: System-wide configuration

Add to /etc/ssh/ssh_known_hosts:

@cert-authority *.example.com ssh-ed25519 AAAA...

This says: "Trust any host certificate signed by this CA for hosts matching *.example.com"

Pattern matching:

| Pattern             | Matches                                         |
|---------------------|-------------------------------------------------|
| *.example.com       | Any subdomain of example.com                    |
| *.prod.example.com  | Only prod subdomain hosts                       |
| 10.0.1.*            | IP range (fragile with DHCP or dynamic cloud IPs) |
| *                   | Everything (probably too broad)                 |

Best practice: Scope the pattern to your infrastructure. Don't use * unless you control the CA tightly.
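The @cert-authority line is just the CA's public key prefixed with the marker and a host pattern, so it can be generated rather than hand-edited. A sketch using a throwaway CA and an illustrative pattern:

```shell
tmp=$(mktemp -d)   # stand-in for the CA machine and a client's known_hosts
ssh-keygen -q -t ed25519 -N '' -f "$tmp/host_ca" -C "demo host CA"

# Prepend the marker and host pattern to the CA public key; in real use,
# append to ~/.ssh/known_hosts or /etc/ssh/ssh_known_hosts instead.
printf '@cert-authority *.example.com %s\n' "$(cat "$tmp/host_ca.pub")" \
  >> "$tmp/known_hosts"

cat "$tmp/known_hosts"
```

Generating the line this way avoids copy-paste truncation of the base64 key material, a common cause of "still getting the fingerprint prompt."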

7. The Complete Picture: User + Host Certificates

With both certificate types deployed:

┌──────────────┐                      ┌──────────────┐
│    Client    │                      │    Server    │
│              │                      │              │
│ Trusts:      │                      │ Trusts:      │
│ • Host CA    │◀────── TLS-like ────▶│ • User CA    │
│              │      mutual trust    │              │
│ Has:         │                      │ Has:         │
│ • User cert  │                      │ • Host cert  │
└──────────────┘                      └──────────────┘
Client verifies server

"Is this host certificate signed by a CA I trust?"

Server verifies client

"Is this user certificate signed by a CA I trust?"

Zero TOFU. Zero authorized_keys. Zero fingerprint prompts.
Both sides authenticate cryptographically through trusted CAs.

Compliance note

SSH host certificates align with requirements in common security frameworks (NIST 800-53, SOC 2, PCI DSS)—centralized key management, documented trust chains, and auditable access control. Auditors appreciate the clear CA hierarchy and the elimination of ad-hoc host key acceptance.

8. Use Cases

Host certificates shine in these environments:

Multi-tenant bastion hosts

Users from different teams connect through shared jump hosts. Host certs eliminate fingerprint prompts across all bastions without per-user known_hosts management.

Ephemeral cloud infrastructure

Auto-scaling groups, spot instances, and Kubernetes nodes spin up/down constantly. Certificates signed at boot mean instant trust.

GitOps / CI pipelines

Deployment scripts SSH to targets without StrictHostKeyChecking=no. Real security instead of disabled security.

Server rebuilds and migrations

Rebuilding servers no longer triggers "HOST KEY HAS CHANGED" warnings. Same hostname + new cert from same CA = seamless trust.

9. Deployment Strategy

Rolling out host certificates is low-risk:

Phase 1: Sign host keys (no client changes)
  • Sign existing host keys on all servers
  • Add HostCertificate to sshd_config
  • Servers now offer certificates, but clients don't require them
  • Zero impact—certificates are ignored if clients aren't configured

Phase 2: Distribute CA trust to clients
  • Add @cert-authority to known_hosts (user or system-wide)
  • Clients now verify host certificates when available
  • Falls back to traditional host key verification for non-certified servers
  • Users stop seeing fingerprint prompts for certified hosts

Phase 3: Clean up known_hosts (optional)
  • Remove individual host entries from known_hosts
  • Only the CA line remains
  • Dramatically smaller known_hosts files

Timeline: Can be done in a day for small environments. Weeks for large enterprises with change management.
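For Phase 3, ssh-keygen -R removes individual host entries safely: it edits the file in place and keeps a .old backup. A sketch against a scratch known_hosts file (hostname and paths illustrative):

```shell
tmp=$(mktemp -d)

# Build a known_hosts file containing one traditional per-host entry
ssh-keygen -q -t ed25519 -N '' -f "$tmp/key"
printf 'server01.example.com %s\n' "$(cut -d' ' -f1,2 < "$tmp/key.pub")" \
  > "$tmp/known_hosts"

# Remove the entry; without -f, ssh-keygen -R edits ~/.ssh/known_hosts
ssh-keygen -R server01.example.com -f "$tmp/known_hosts"
```

Run one -R per retired hostname, leaving only the @cert-authority line behind.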

Who should own the host CA?

Typically SRE/platform teams own the host CA since they control server infrastructure. Keep the CA private key in a secure location (HSM, Vault, or air-gapped machine)—never on individual servers. This creates clear separation: platform team signs host certs, security team audits, app teams consume.

10. Validity Periods and Renewal

Host certificates can have longer validity than user certificates:

| Certificate Type   | Typical Validity | Why                                              |
|--------------------|------------------|--------------------------------------------------|
| User certificates  | Hours to days    | Short-lived = no revocation needed               |
| Host certificates  | Weeks to months  | Servers are stable, renewal is operationally harder |

Why short validity + automation beats revocation: SSH certificates don't have practical CRL/OCSP support like X.509. Rather than building complex revocation infrastructure, most organizations use shorter validity periods (weeks, not years) combined with automated renewal. If a host is compromised, the certificate expires naturally before manual revocation would propagate anyway.

Renewal approaches:

Manual renewal
  • Calendar reminder before expiration
  • Re-sign and deploy new certificate
  • Fine for small environments

Automated renewal
  • Cron job or configuration management
  • Server requests new cert periodically
  • Requires a signing service (Vault, step-ca, Venafi)

Example: Renewal check script

#!/bin/bash
set -euo pipefail

CERT="/etc/ssh/ssh_host_ed25519_key-cert.pub"

# ssh-keygen -L prints "Valid: from <start> to <end>"; field 5 is the
# expiration timestamp. (Certs signed without -V show "Valid: forever"
# and will fail the date parse below.)
EXPIRY=$(ssh-keygen -L -f "$CERT" | awk '/Valid:/ {print $5}')
EXPIRY_EPOCH=$(date -d "$EXPIRY" +%s)
NOW_EPOCH=$(date +%s)
DAYS_LEFT=$(( (EXPIRY_EPOCH - NOW_EPOCH) / 86400 ))

if [ "$DAYS_LEFT" -lt 14 ]; then
    echo "Host certificate expires in $DAYS_LEFT days - renewal needed"
    # Trigger renewal process
fi

11. Troubleshooting

Common issues and solutions:

| Symptom                            | Likely Cause                        | Fix                                    |
|------------------------------------|-------------------------------------|----------------------------------------|
| Still getting fingerprint prompt   | Client not configured to trust CA   | Check @cert-authority in known_hosts   |
| name is not a listed principal     | Hostname not in -n list             | Re-sign cert with all hostnames/IPs    |
| Certificate invalid: expired       | Past validity period                | Re-sign with new validity              |
| Cert not presented                 | sshd_config missing HostCertificate | Add directive and restart sshd         |
| Works for some users, not others   | Per-user vs system known_hosts      | Check both locations                   |

Debug from client side:

ssh -vvv user@server 2>&1 | grep -i "host cert"

Look for:

  • Server host certificate: ... - Server is presenting certificate
  • Host certificate: ... CA ... - Client is checking against CA

Verify certificate on server:

ssh-keygen -L -f /etc/ssh/ssh_host_ed25519_key-cert.pub

Check principals list and validity dates.

12. Host Certificates + Configuration Management

Deploying at scale with Ansible, Puppet, or similar:

Ansible example:

# Sign host key (run from CA machine)
- name: Sign host certificate
  command: >
    ssh-keygen -s /etc/ssh/host_ca
    -I "{{ inventory_hostname }}-{{ ansible_date_time.date }}"
    -h
    -n {{ inventory_hostname }},{{ ansible_hostname }},{{ ansible_default_ipv4.address }}
    -V +52w
    /etc/ssh/ssh_host_ed25519_key.pub
  delegate_to: ca_server

# Deploy certificate to host
- name: Deploy host certificate
  copy:
    src: "/etc/ssh/certs/{{ inventory_hostname }}-cert.pub"
    dest: /etc/ssh/ssh_host_ed25519_key-cert.pub
    mode: '0644'
  notify: restart sshd

# Configure sshd
- name: Add HostCertificate directive
  lineinfile:
    path: /etc/ssh/sshd_config
    line: "HostCertificate /etc/ssh/ssh_host_ed25519_key-cert.pub"
  notify: restart sshd

Key consideration: The CA private key should NOT be on every server. Either:

  • Sign certificates on a central CA machine
  • Use a signing service (Vault, Venafi) that servers request certs from

FAQ

Do I need to regenerate my server's host keys?

No. Host certificates sign your existing host public key. The private key stays exactly the same.

What happens if a server's certificate expires?

Clients configured to trust the CA will fall back to traditional host key verification—you'll see the fingerprint prompt again. Existing connections aren't affected.

Can I use the same CA for users and hosts?

Technically yes, but don't. Separate CAs mean: compromise of user CA doesn't let attackers create fake servers, compromise of host CA doesn't let attackers create user certificates, and clearer audit trail.

How do I handle servers with dynamic IPs?

Include the hostname in principals, not the IP. Or use broader patterns in your @cert-authority line. For truly ephemeral infrastructure, consider re-signing on boot.

What about jump hosts / bastion servers?

Jump hosts should have host certificates like any other server. The client verifies the jump host, then verifies the destination. Both use the same CA trust.

Does this work with ProxyJump?

Yes. Each hop is verified independently against the CA.

What about SSHFP DNS records?

SSHFP records publish host key fingerprints in DNS (protected by DNSSEC). They solve a similar problem but require DNSSEC deployment and DNS infrastructure changes. Host certificates are generally simpler operationally and don't depend on DNS security. Most organizations choose one approach; this guide focuses on certificates.
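For comparison, SSHFP records can be generated directly from a host key with ssh-keygen -r; the hostname and paths below are illustrative:

```shell
tmp=$(mktemp -d)
ssh-keygen -q -t ed25519 -N '' -f "$tmp/ssh_host_ed25519_key"

# Emit DNS SSHFP resource records for this key, ready to paste into a zone file
ssh-keygen -r server01.example.com -f "$tmp/ssh_host_ed25519_key.pub"
```

Clients only honor these records with VerifyHostKeyDNS enabled, and only trust them fully when the zone is DNSSEC-signed.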
