Security & trust

Built for teams who have to explain AI to risk and compliance stakeholders. This page is the plain-language summary; your own review and our contracts govern the details.

How we think about safety

Security is a shared responsibility between FAi (hosted at finitybot.ai), your identity choices, and how you configure integrations.

Signed-in access
People use their own accounts to open FAi, so activity stays tied to real users and your workspace rules—not anonymous traffic.
Workspace isolation
Company membership and roles shape what each person can see, which reduces mix-ups when many teams share the same product.
Integration connections
Links to tools like Notion or Discord use credentials you control. Treat them like any other sensitive login and rotate them on your schedule; see our Privacy Policy for how we handle data.
Protected paths
Traffic between you and our hosted service is encrypted in transit. Behind the scenes we keep a tight boundary between the chat you see and the systems that power it.
Defense in depth
We follow common SaaS practices: encryption, careful internal access, and room for your security team to review how FAi fits your program.
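The "protected paths" and "defense in depth" points above amount to standard transport-security hygiene. As an illustrative sketch only (this is not FAi's code, and the hardening choices are assumptions a reviewing security team might make), a client talking to a hosted service like finitybot.ai could enforce certificate verification and a TLS floor like this:

```python
import ssl

def hardened_client_context() -> ssl.SSLContext:
    """Build a client-side TLS context with strict defaults."""
    # The default context already verifies the server's certificate
    # chain and hostname; we keep those checks on.
    ctx = ssl.create_default_context()
    # Refuse anything older than TLS 1.2.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

A client would pass this context to its HTTP library when connecting, so an unverified or downgraded connection fails loudly instead of silently.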

Responsible use

AI outputs can be incorrect or incomplete. Teams should validate high-stakes decisions independently. Configure integrations so the assistant only reaches data that policy allows.
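One way to make "only reaches data that policy allows" concrete is an explicit allowlist checked before every integration read. Everything below is hypothetical — the source identifiers and function names are illustrative, not part of FAi:

```python
# Hypothetical policy gate: the assistant may read only sources
# your team has explicitly approved.
ALLOWED_SOURCES = {
    "notion:engineering-wiki",
    "discord:support-channel",
}

def may_read(source_id: str) -> bool:
    """Return True only for sources on the approved list."""
    return source_id in ALLOWED_SOURCES
```

Denying by default keeps a newly connected tool invisible to the assistant until someone deliberately adds it to policy.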

For data processing details, retention, subprocessors, and legal terms, read the Privacy Policy and Terms of Service.