City officials outline inventory, oversight plans and staff guidance in first Philadelphia AI hearing

Committee on Technology and Information Services, Philadelphia City Council · October 24, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

City of Philadelphia officials told the Council’s Committee on Technology and Information Services on Oct. 31 that the city already uses both narrow machine‑learning systems and licensed generative AI features, and that it is building staff guidance, procurement safeguards and a governance committee to oversee future uses.

Kristen Gray, chief legal counsel to Mayor Cherelle Parker and director of PhillyStat360, and Melissa Scott, chief information officer for the Office of Innovation and Technology, testified that the administration is guided by three principles — human judgment, public trust and accountability, and responsible use of smart tools — and that staff guidelines for generative AI will be published in plain language for employees and the public in late winter or early spring.

Gray said the rules are meant to ensure that “powerful tools must be used responsibly or we risk deepening inequalities and weakening trust with the public.” Scott told the committee the city treats machine‑learning systems (used for tasks such as anomaly detection, 311 routing and cybersecurity monitoring) differently from generative AI (tools that create new text, images or code on demand). She described two lists: one of narrow, task‑specific machine‑learning applications already embedded in city operations, and another of commercial software packages that include generative features if a department turns them on.

Scott gave examples of licensed commercial tools the city has approved or evaluated: GitHub Copilot and Microsoft 365 Copilot for code and document drafting; Adobe Firefly for image editing; Qualtrics and Tableau for natural‑language summarization and analytics; Salesforce’s Einstein GPT for CRM tasks; ServiceNow for ticket summarization; Zoom for meeting summaries; and design tools such as Miro, Canva and Figma. For cybersecurity and analytics she cited Splunk, SOC Radar and RunZero.

The city said it uses a standing procurement review called Project Health Check for any software purchase. Scott said departments must answer a 13‑question intake (covering accessibility, records management, transparency, accountability and security) and that some contracts include ongoing monitoring clauses: “For Copilot we pull monthly chat reports and prompt activity to monitor for sensitive data and inappropriate use,” she said.

Council members repeatedly asked which departments have independent technology units and whether deconsolidated units (for example, Recreation) are required to run purchases through OIT. Scott said oversight varies by department and agreed to provide a list of deconsolidated units. She also said that OIT, in partnership with the law department, Office of Human Resources and the chief administrative officer, reviews vendor risk and helps design contract language that would require notification when vendors enable new or changed AI functionality.

The administration said it is not aware of any use of generative AI within public‑safety systems, but it did not rule out historical or separate procurement paths and promised to follow up with the police department. It identified Johnson Controls Inc. as the license‑plate‑reader vendor named on the record.

The administration repeatedly emphasized continuous policy development; Scott said an AI governance committee will “continuously review and refine our policies, track evolving technology, work with experts, [and] ensure that the city's use of artificial intelligence remains aligned with our values.” Gray and Scott agreed that transparency to the public is a goal and said training for employees will accompany any rollout of generative tools.

Members of the committee pressed the administration about contract enforcement if vendors change underlying models or add AI features after signing. Scott described risk assessments, contractual requirements for monitoring and reporting, clauses about output errors and bias mitigation, and background monitoring tools that can flag unapproved activations. The administration said it will include the law department in governance to address privacy, civil‑rights and constitutional issues.

The committee did not vote on Resolution 240759 during the hearing; the hearing was recorded as a public hearing on that resolution and will continue as part of the committee’s follow‑up process.

The administration provided to the committee an initial inventory of commercial services that include generative features and a longer list of machine‑learning systems in use for cybersecurity, performance monitoring and administrative automation. It recommended continuing external consultation with vendors, academic partners and peer cities while developing citywide practices.

The committee will hold additional hearings, and the chair said she expects to involve community voices and outside experts as the city drafts policies and procurement language.