AI Accountability Act: What Small Businesses Must Know

March 30, 2026 · Martin Bowling

AI bias audits are no longer optional

The federal AI Accountability Act (H.R. 1694) is moving through Congress, and it is not alone. Colorado’s AI Act takes effect June 30, 2026. Illinois already requires AI hiring disclosures. New York City mandates annual bias audits for any automated hiring tool. And the White House just released a national AI framework that proposes risk-tiered compliance for every business using AI.

If your business uses AI for hiring screens, loan decisions, customer intake, or scheduling, these rules apply to you. Small businesses are not exempt. A 10-person company using an AI hiring tool in New York faces the same audit requirement as a Fortune 500 firm.

What is actually happening

The federal push

The AI Accountability Act, introduced by Rep. Josh Harder (D-CA), directs the National Telecommunications and Information Administration to study and recommend accountability measures — audits, assessments, certifications — for AI systems. It has not been signed into law yet, but it signals where federal policy is heading.

More immediately, the White House released its National Policy Framework for Artificial Intelligence on March 20, 2026. The framework introduces risk tiers: AI tools used for internal document drafting face light disclosure rules, while tools that screen job applicants or approve loans face mandatory audits and human oversight.

The framework also proposes federal preemption of state AI laws — meaning Congress could eventually replace the current patchwork of state rules with a single national standard.

State laws that are already real

Federal legislation takes time. State laws do not wait.

Colorado SB 24-205 takes effect June 30, 2026. If you deploy a high-risk AI system — one that influences decisions about employment, lending, housing, healthcare, or education — you must conduct an impact assessment within 90 days and repeat it annually. You need a documented risk management policy. Violations count as unfair trade practices under Colorado consumer protection law.

Illinois HB 3773 took effect January 1, 2026. It prohibits AI that discriminates against employees based on protected characteristics and requires employers to notify workers when AI influences employment decisions.

NYC Local Law 144 has been in effect since 2023. Any employer using an automated employment decision tool must complete an annual independent bias audit and post the results publicly. Candidates must be notified at least 10 business days before such a tool is used on them.

Why this matters for your business

You are probably already using affected tools

If you use AI-powered hiring platforms, automated loan underwriting, resume screening, or customer credit scoring, these laws likely apply to you. The definition of “high-risk” is broad. Colorado’s law covers any AI system that “substantially contributes” to a consequential decision — and “substantially contributes” is defined loosely enough to catch many common business tools.

Outsourcing does not shift responsibility

Using a third-party AI tool does not protect you. Under Colorado’s law and NYC’s Local Law 144, the business deploying the tool — not just the company that built it — bears compliance responsibility. If your AI vendor’s hiring tool discriminates, you face the enforcement action.

Costs are real but manageable

Independent bias audits range from $5,000 to $50,000 depending on the complexity of the system. Annual impact assessments add administrative overhead. But the alternative — an enforcement action or discrimination lawsuit — costs far more.

What you should do now

1. Inventory your AI tools

List every AI-powered tool your business uses. For each one, determine whether it influences decisions about hiring, lending, scheduling, pricing, or customer access. Those are the tools that trigger compliance obligations.
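A lightweight way to run this inventory is one structured record per tool. The sketch below is illustrative only: the field names and the tool entries are hypothetical, and the set of regulated decision areas is a simplification of the categories the laws above actually enumerate.

```python
from dataclasses import dataclass

# Decision areas that commonly trigger compliance obligations under the
# laws discussed above. Illustrative only -- check each statute's text.
REGULATED_DECISIONS = {"hiring", "lending", "scheduling", "pricing", "customer_access"}

@dataclass
class AITool:
    name: str
    vendor: str
    decisions_influenced: set[str]  # e.g. {"hiring"}

    def triggers_compliance(self) -> bool:
        # A tool needs review if it influences any regulated decision area.
        return bool(self.decisions_influenced & REGULATED_DECISIONS)

# Hypothetical inventory entries for illustration.
inventory = [
    AITool("ResumeRanker", "ExampleVendor", {"hiring"}),
    AITool("DraftAssist", "ExampleVendor", {"internal_docs"}),
]

needs_review = [t.name for t in inventory if t.triggers_compliance()]
print(needs_review)  # -> ['ResumeRanker']
```

Even a spreadsheet with the same columns works; the point is that the inventory makes the compliance-triggering tools jump out.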

2. Ask your vendors hard questions

Contact each AI vendor and ask:

  • Has this tool been independently audited for bias?
  • Can you provide documentation on training data and algorithmic fairness testing?
  • What risk tier does this tool fall under in the White House framework?
  • Will you provide contractual guarantees of compliance?

If a vendor cannot answer these questions, that is a red flag.

3. Document everything

Start a compliance file. Record which tools you use, what decisions they influence, when they were last audited, and what your risk management policy is. Colorado requires you to maintain this documentation. Even if your state does not require it yet, having records protects you.
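If that compliance file is machine-readable, flagging tools overdue for their annual audit becomes trivial. The sketch below assumes a 365-day window as a simplification of the annual cadence in NYC Local Law 144 and Colorado SB 24-205, and the tool names and dates are hypothetical.

```python
from datetime import date, timedelta

AUDIT_INTERVAL = timedelta(days=365)  # simplified "annual" cadence

# Hypothetical compliance file: tool -> date of last independent audit.
last_audited = {
    "ResumeRanker": date(2025, 2, 14),
    "LoanScreener": date(2026, 1, 10),
}

def overdue(today: date) -> list[str]:
    """Return tools whose last audit is more than a year old."""
    return [tool for tool, audited in last_audited.items()
            if today - audited > AUDIT_INTERVAL]

print(overdue(date(2026, 3, 30)))  # -> ['ResumeRanker']
```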

4. Watch the June 30 deadline

If you operate in Colorado or serve Colorado customers, the clock is ticking. Impact assessments must be completed within 90 days of the law taking effect. That means you need to start now, not in June.
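Assuming the 90-day window runs from the June 30, 2026 effective date (check the statute for the actual trigger event), the deadline lands in late September:

```python
from datetime import date, timedelta

effective = date(2026, 6, 30)              # Colorado SB 24-205 effective date
deadline = effective + timedelta(days=90)  # 90-day assessment window
print(deadline)  # -> 2026-09-28
```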

5. Monitor federal preemption

The White House framework could eventually simplify compliance by replacing state-by-state rules with a single federal standard. But “could eventually” is not a compliance strategy. Plan for the rules that exist today. We covered the federal preemption signals in detail in our analysis of the March 11 deadlines.

The bottom line

AI accountability is not a future problem. State laws are creating real obligations right now, and federal legislation is building momentum. Small businesses that get ahead of compliance — by auditing their tools, questioning their vendors, and documenting their processes — will spend less and worry less than those who wait for enforcement to find them.

If you need help evaluating which AI tools meet compliance standards or building a governance process that fits a small business budget, reach out to our consulting team. We work with Appalachian businesses to adopt AI responsibly — tools that perform well and hold up to scrutiny.

Small Business · Industry News · AI Tools · Automation