Hackathon puts AI to the test in real engineering workflows

In their latest hackathon, the Pixo team explored how AI—using Claude Code—can help streamline certain parts of engineering workflows.
[Image: Three Pixo team members gathered around a laptop, smiling]

Incorporating AI—strategically and selectively—into project-based workflows

Pixo’s engineering team holds regular hackathons to collaboratively test new tools, build skills, and explore ideas that could translate into real-world solutions. In the latest session, the team put Claude Code to work streamlining specific parts of their engineering workflows.

In keeping with Pixo’s broader approach to AI, these efforts led with human judgment and expertise and introduced AI thoughtfully, as a tool—never a substitute.

Working in pairs, the team took on three distinct projects. Here are some highlights and insights:

Project #1: Using AI + MCP for UI implementation

In this project, the team explored how Model Context Protocol (MCP) could be used alongside AI tools to plan and execute the implementation of UI designs created in Figma and Figma Make.

Rather than replacing engineering work, the process shifted where effort was spent. The team had to rethink how they structured tasks for AI—breaking work into smaller, more discrete steps—and closely review outputs along the way. Accessibility issues, in particular, required manual validation and remediation, while images and icons proved more difficult for the AI to handle reliably.

As one team member noted:

“When [Claude] needed to iterate on things, it was iterating on them because we reviewed them and told it to iterate on them. So it wasn’t really ‘downtime’—it was human brain time.”

That dynamic led to a modified workflow—and a new flavor of challenge:

“It was a whole different kind of problem-solving than ‘normal’ engineering.”

Key takeaway: AI can accelerate parts of the process, but it doesn’t reduce the need for thoughtful oversight—especially where accessibility is concerned.

“Diligent reviewing of these things would be very important.”

Project #2: Spinning up a CRUD app with AI

In this project, the team set out to create a simple CRUD (create, read, update, delete) interface for managing structured editorial content—such as books with chapters or journals with articles—with export capabilities for ingestion into research databases.
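The article doesn't show the team's implementation, but the shape of such an app is easy to sketch. Here is a minimal, illustrative in-memory version of the same idea—books containing chapters, with the four CRUD operations and a JSON export for downstream ingestion. All names (Book, Chapter, BookStore) are hypothetical, and a real build would sit on an actual database rather than a dictionary:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Chapter:
    title: str
    body: str

@dataclass
class Book:
    id: int
    title: str
    chapters: list = field(default_factory=list)

class BookStore:
    """Minimal in-memory CRUD store with a JSON export for ingestion elsewhere."""

    def __init__(self):
        self._books = {}
        self._next_id = 1

    def create(self, title):
        book = Book(self._next_id, title)
        self._books[book.id] = book
        self._next_id += 1
        return book

    def read(self, book_id):
        return self._books.get(book_id)

    def update(self, book_id, title):
        self._books[book_id].title = title

    def delete(self, book_id):
        self._books.pop(book_id, None)

    def export_json(self):
        # Export everything as JSON, ready for a research database to ingest.
        return json.dumps([asdict(b) for b in self._books.values()], indent=2)

store = BookStore()
b = store.create("Field Notes")
b.chapters.append(Chapter("Intro", "..."))
store.update(b.id, "Field Notes, 2nd ed.")
print(store.export_json())
```

The same structure generalizes to journals with articles: swap the dataclasses, keep the store and export.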

Using AI to accelerate development introduced a new wrinkle to traditional pair programming. Instead of working side by side on the same task, the team had to rethink how to use the “downtime” while AI processed requests. That often meant dividing and conquering—working in parallel across different machines to keep momentum going, or using that time for secondary tasks and learning.

The result was a fully functional prototype, built remarkably quickly:

“All of this is backed by an actual database…and you’re able to spin this up from nothing in a day.”

At the same time, the experiment surfaced some clear tradeoffs—particularly around design. The team intentionally chose not to establish a design system, allowing the AI to generate UI patterns freely.

The outcome was instructive:

“We just let AI do whatever it felt like doing, which ended up being pretty ugly.”

Key takeaway: while AI can dramatically accelerate functional builds, thoughtful design direction remains essential to delivering a polished, user-friendly experience.

Project #3: AI-assisted content auditing and migration

In this project, the team explored how AI could support one of the most time-intensive parts of web projects: content auditing and migration.

They built an AI-powered scraper to generate a comprehensive content audit spreadsheet—capturing not just sitemap URLs, but also featured images and linked files. From there, the data was organized into structured headings that could map directly to fields in Drupal or WordPress, setting the stage for more automated importing.
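The article doesn't include the team's scraper, but the core steps it describes—read the sitemap, pull each page's featured image and file links, write audit rows to a spreadsheet—can be sketched with the Python standard library alone. Everything below (function names, the choice of og:image as "featured image", the file extensions, the sample URLs) is an illustrative assumption, and the demo parses inline strings so it runs without network access:

```python
import csv
import io
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
FILE_EXTS = (".pdf", ".docx", ".xlsx", ".zip")

def parse_sitemap(xml_text):
    """Return the list of page URLs in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

class AssetExtractor(HTMLParser):
    """Collect the og:image URL and any links to downloadable files."""

    def __init__(self):
        super().__init__()
        self.featured_image = ""
        self.linked_files = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("property") == "og:image":
            self.featured_image = a.get("content", "")
        if tag == "a" and a.get("href", "").lower().endswith(FILE_EXTS):
            self.linked_files.append(a["href"])

def audit_row(url, html_text):
    """One spreadsheet row: URL, featured image, linked files."""
    ex = AssetExtractor()
    ex.feed(html_text)
    return {"url": url, "featured_image": ex.featured_image,
            "linked_files": "; ".join(ex.linked_files)}

# Demo with inline samples in place of real HTTP fetches.
sitemap = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
page = """<html><head><meta property="og:image" content="/img/team.jpg"></head>
<body><a href="/files/annual-report.pdf">Report</a></body></html>"""

rows = [audit_row(url, page) for url in parse_sitemap(sitemap)]
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["url", "featured_image", "linked_files"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

The CSV columns here stand in for the "structured headings" the team mapped to Drupal or WordPress fields.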

To streamline the process further, the team created a set of reusable “skills”—one focused on preparing content for import, and another tailored to the needs of a content strategist conducting an audit. They also prompted the AI to document its own steps, allowing those workflows to be saved and reused—effectively using AI to help train future AI-assisted tasks.
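The article doesn't reproduce these skills, but in Claude Code a skill is a directory containing a SKILL.md file with YAML frontmatter. A hypothetical version of the import-prep skill might look like this (the name and steps are illustrative, not the team's actual skill):

```markdown
---
name: content-import-prep
description: Prepare scraped page content for import into Drupal or WordPress.
---

When asked to prepare content for import:
1. Convert each page's HTML body to Markdown.
2. Map page metadata (title, URL, featured image) to the target CMS fields.
3. Flag anything that could not be converted cleanly for human review.
```

Because the instructions live in a plain file, the workflow can be versioned, shared, and reused—which is what made "AI documenting its own steps" pay off.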

Along the way, they hit a practical limitation: spreadsheet cells are a poor fit for storing raw HTML. To work around this, the team converted the content to Markdown that a Drupal importer could process.
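A translation like this can be sketched with the standard library's html.parser, without guessing at the team's actual tooling. This minimal converter handles only a few common tags (headings, paragraphs, bold, italics, links); a real migration would cover far more, or lean on a dedicated HTML-to-Markdown library:

```python
from html.parser import HTMLParser

class SimpleMarkdown(HTMLParser):
    """Convert a small subset of HTML (h2, p, strong/b, em/i, a) to Markdown."""

    def __init__(self):
        super().__init__()
        self.out = []
        self.href = None

    def handle_starttag(self, tag, attrs):
        if tag in ("strong", "b"):
            self.out.append("**")
        elif tag in ("em", "i"):
            self.out.append("*")
        elif tag == "h2":
            self.out.append("## ")
        elif tag == "a":
            self.href = dict(attrs).get("href", "")
            self.out.append("[")

    def handle_endtag(self, tag):
        if tag in ("strong", "b"):
            self.out.append("**")
        elif tag in ("em", "i"):
            self.out.append("*")
        elif tag == "a":
            self.out.append(f"]({self.href})")
            self.href = None
        elif tag in ("p", "h2"):
            self.out.append("\n\n")  # block elements end a Markdown block

    def handle_data(self, data):
        self.out.append(data)

def html_to_markdown(html):
    conv = SimpleMarkdown()
    conv.feed(html)
    return "".join(conv.out).strip()

print(html_to_markdown(
    '<h2>Chapter 1</h2><p>See the <a href="/guide.pdf"><strong>guide</strong></a>.</p>'
))
```

The resulting Markdown is plain text, so it fits comfortably in a spreadsheet cell and round-trips cleanly through an importer.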

Despite the technical hurdles, the value was clear:

“Overall I was very happy with what AI did, because that is super tedious work…The more we do it, the easier it gets.”

Key takeaway: AI is especially well-suited to reducing the burden of repetitive, detail-heavy, soul-crushing work—freeing up teams to focus on higher-level strategy and decision-making.