What Should Boston's Mayor Wu Do About GAI? Custom Case Solution & Analysis

Evidence Brief: Case Extraction

Financial Metrics

  • Annual Budget: The City of Boston operates with a multi-billion-dollar budget, though no GAI-dedicated funding has been allocated in the initial phase.
  • Headcount: Approximately 18,000 employees across various departments.
  • IT Infrastructure: Managed by the Department of Innovation and Technology (DoIT) with a focus on digital equity and cybersecurity.

Operational Facts

  • Interim Guidelines: Released in May 2023, allowing city employees to experiment with GAI while requiring disclosure and human oversight.
  • Usage Patterns: Employees use GAI for drafting emails, summarizing meetings, and coding assistance.
  • Geography: Boston, Massachusetts; includes diverse neighborhoods with varying levels of digital literacy and access.
  • Regulatory Context: Alignment with the Massachusetts Public Records Law and city-wide privacy policies.

Stakeholder Positions

  • Mayor Michelle Wu: Prioritizes equity, transparency, and using technology to improve city services without displacing workers.
  • Santiago Garces (CIO): Focuses on the technical feasibility, security risks, and the need for a structured framework for experimentation.
  • City Employees: Range from early adopters seeking efficiency to skeptics concerned about job security and accuracy.
  • Boston Residents: Concerned about data privacy, algorithmic bias, and the loss of human interaction in government services.

Information Gaps

  • Specific cost estimates for enterprise-grade GAI licenses versus open-source implementations.
  • Quantitative data on the current volume of unauthorized GAI use within city departments.
  • Detailed legal analysis of liability when GAI-generated content leads to administrative errors or bias.

Strategic Analysis

Core Strategic Question

  • How can the City of Boston integrate GAI to improve municipal efficiency while mitigating risks of algorithmic bias, data privacy breaches, and the erosion of public trust?

Structural Analysis

The PESTEL lens reveals that the social and legal factors are the primary drivers. Socially, Boston residents demand high levels of equity and accountability. Legally, the city must comply with strict public record and privacy laws. A Value Chain analysis indicates that GAI can significantly reduce time spent on administrative tasks (back-office) and improve citizen engagement (front-office), but only if the data inputs are clean and unbiased.

Strategic Options

Conservative Integration
  • Rationale: Focuses on internal back-office tasks only to minimize public-facing errors.
  • Trade-offs: Limits innovation speed; misses the opportunity for direct citizen-service improvement.
  • Resources: Internal IT staff; existing software budgets.

Aggressive Pilot Program
  • Rationale: Rapidly deploys GAI in high-impact areas such as 311 services and permit processing.
  • Trade-offs: High risk of public failure; potential for biased outcomes in service delivery.
  • Resources: External consultants; new GAI enterprise licenses.

Community-Centric Framework
  • Rationale: Prioritizes open-source tools and public co-creation of AI policies.
  • Trade-offs: Slower implementation; requires significant public-engagement effort.
  • Resources: Public engagement teams; DoIT developers.

Preliminary Recommendation

Boston should pursue a Community-Centric Framework focused on internal efficiency first. This path allows the city to build technical competency and establish guardrails before scaling to public-facing applications. It aligns with Mayor Wu’s commitment to equity by ensuring that AI tools do not become black boxes that exclude or disadvantage specific populations.

Implementation Roadmap

Critical Path

  • Month 1: Establish a cross-departmental GAI Steering Committee led by the CIO.
  • Month 2: Finalize the GAI Use Policy, moving beyond interim guidelines to include specific prohibited use cases.
  • Month 3: Launch an internal sandbox environment for DoIT to test enterprise-grade LLMs.
  • Months 4-6: Execute three pilot projects in non-critical departments (e.g., internal knowledge management, drafting routine communications).

Key Constraints

  • Technical Debt: Legacy systems may not easily integrate with modern GAI APIs.
  • Talent Scarcity: High competition with the private sector for AI and data science expertise in the Boston area.
  • Data Integrity: Historical city data may contain biases that GAI models will amplify if not properly audited.

Risk-Adjusted Implementation Strategy

To manage operational friction, the city will implement a phased rollout. If a pilot project fails to meet accuracy benchmarks (e.g., 95% accuracy in summarization), the project will revert to manual processes until the model is refined. Contingency funds will be set aside for third-party algorithmic audits to ensure equity goals are met before any public-facing tool is deployed.
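The gating rule above can be sketched as a simple evaluation check. This is an illustrative sketch only: the 95% threshold comes from the text, but the function names and the pass/revert decision labels are assumptions, not part of any actual city system.

```python
# Hypothetical sketch of the pilot gating rule described above.
# The function names and decision labels are illustrative assumptions;
# only the 95% accuracy benchmark comes from the roadmap itself.

ACCURACY_THRESHOLD = 0.95  # minimum summarization accuracy to keep a pilot live


def meets_benchmark(correct: int, total: int) -> bool:
    """Return True if the pilot's measured accuracy meets the benchmark."""
    if total == 0:
        return False  # no evidence yet: stay on manual processes
    return correct / total >= ACCURACY_THRESHOLD


def gate_decision(correct: int, total: int) -> str:
    """Map an evaluation result to the rollout decision in the roadmap."""
    if meets_benchmark(correct, total):
        return "continue pilot"
    return "revert to manual processes"


# Example: 93 accurate summaries out of 100 falls below the 95% benchmark,
# so the pilot reverts to manual processes until the model is refined.
print(gate_decision(93, 100))  # revert to manual processes
print(gate_decision(97, 100))  # continue pilot
```

In practice, the same check would run per pilot project, with third-party algorithmic audits (funded from the contingency reserve) as a separate gate before any public-facing deployment.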

Executive Review and BLUF

BLUF

Boston must transition from passive guidelines to a controlled, internal-first GAI deployment. The primary objective is to capture administrative efficiencies while building the governance required to prevent algorithmic bias. Mayor Wu should authorize the creation of an Internal GAI Sandbox. This allows for experimentation without exposing the public to the risks of hallucination or data leakage. Success depends on rigorous human-in-the-loop protocols and a refusal to deploy public-facing AI until audit frameworks are verified. Speed must be secondary to safety and equity to maintain the administration’s core promise to the residents of Boston.

Dangerous Assumption

The most consequential unchallenged premise is that city employees will adhere to disclosure requirements. Without automated detection or strict technical controls, shadow AI use will likely continue, leading to unvetted data entering public records and potential legal liabilities.

Unaddressed Risks

  • Data Privacy Contamination: Probability: High. Consequence: Severe. Sensitive resident data could be inadvertently used to train commercial models if employees use non-enterprise versions of GAI tools.
  • Algorithmic Redlining: Probability: Moderate. Consequence: Severe. GAI tools used in resource allocation or 311 prioritization may inadvertently favor neighborhoods with higher digital engagement, deepening existing inequities.

Unconsidered Alternative

The analysis overlooked the option of a public-private partnership with local universities (Harvard/MIT) to build a localized, open-source LLM trained specifically on Boston municipal data. This would address data sovereignty and ensure the model is tuned to the city's specific linguistic and demographic nuances, rather than relying on generic commercial models.

Verdict

APPROVED FOR LEADERSHIP REVIEW

