NY Assemblyman Alex Bores on AI regulation: Need to ensure AI benefits the many, not just the few
February 12, 2026 at 15:27 UTC  |  7:29  |  CNBC
Speakers
Alex Bores — NY State Assembly Member

Summary

  • Assemblyman Bores is pushing for a federal version of New York's "Raise Act," a regulatory framework for AI focused on safety, child protection, and preventing deepfakes.
  • He highlights significant opposition from major tech figures (OpenAI, Palantir, a16z) who have spent millions to block his legislation, suggesting the industry views his proposals as a material constraint.
  • Bores argues the market is currently "undervaluing trustworthiness," predicting that long-term winners will be models with verified safety rather than just raw speed.
  • He advocates for government-funded compute clusters (like NY's $400M investment) to democratize access, potentially creating a new source of demand for hardware.
Trade Ideas
  • LONG | Alex Bores, NY State Assembly Member | 2:40
    Thesis: Bores highlights New York State's $400 million investment to build its own "compute cluster" (Empire AI) and explicitly states, "We should be doing that same thing in investing in the capacity at the federal level." If Bores' plan for a federal version of the "Raise Act" includes the government purchasing its own compute capacity to expedite research and create a public option, the U.S. government becomes a massive direct buyer of AI hardware (GPUs). This adds a new, sovereign layer of demand for chipmakers (NVDA) beyond private-sector capex. LONG hardware/infrastructure providers as sovereign AI spending ramps up.
    Risks: Legislative gridlock; federal budget constraints.

  • WATCH | Alex Bores, NY State Assembly Member | 0:25
    Thesis: Bores explicitly notes that a SuperPAC supported by "OpenAI President Greg Brockman, Palantir co-founder Joe Lonsdale... poured millions into opposing the legislation." When industry insiders (Palantir, OpenAI/Microsoft) spend millions to kill a specific bill, it signals that the legislation poses a material risk to their business models or operational freedom. Bores is now trying to take this exact legislation national. WATCH: if Bores gains traction in Congress, the specific regulatory risks that PLTR and OpenAI identified as threats could materialize federally.
    Risks: The legislation may fail to pass or be watered down; industry lobbying is powerful.

  • LONG | Alex Bores, NY State Assembly Member
    Thesis: "The market actually currently is undervaluing trustworthiness of AI. The AI that will win in the long term will be trustworthy AI." Bores argues we are moving from a "move fast" phase to a "verification" phase. As regulations tighten, the moat shifts from raw compute speed to safety and compliance verification, favoring established players who can afford the "extensive checks" and audit layers over "wild west" startups. LONG companies building trustworthy-AI infrastructure and compliance layers.
    Risks: Over-regulation stifles innovation entirely; China/competitors ignore safety and win on capability.