Lead: The Moment AI Met the Green Screen
Across busy IBM i shops, a quiet shock rippled as developers watched AI assistants generate usable RPG, CL, and DDS in minutes: code that compiled, ran, and even passed early tests without the handholding long assumed necessary on a platform many considered immune to such leaps. That speed thrilled management but raised a sharper question on the floor: could legacy delivery practices keep up without turning “faster” into “fragile,” especially when PDM menus and manual promotions still held the release keys?
At COMMON POWERUp, AI demos no longer felt like theater. IBM’s “Bob,” Anthropic’s Claude, and OpenAI’s ChatGPT showed they could draft refactors, create test scaffolds, and convert DDS to DDL with confidence. The spotlight, however, shifted to the delivery backbone. “The code isn’t the blocker anymore,” an IBM i architect remarked. “The pipeline is.”
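The DDS-to-DDL conversions shown in those demos rest on a mechanical mapping at their core. A minimal sketch of that idea, assuming simplified field tuples rather than real column-positional DDS parsing, and covering only a few common physical-file types (the CUSTMAST fields below are hypothetical):

```python
# Simplified DDS-to-DDL field mapping. Real DDS source is column-positional;
# this sketch takes (name, length, type, decimals) tuples for illustration.

def dds_field_to_ddl(name, length, dtype, decimals=None):
    """Translate one DDS physical-file field into a DDL column definition."""
    if dtype == "A":                       # character
        return f"{name} CHAR({length})"
    if dtype == "P":                       # packed decimal
        return f"{name} DECIMAL({length}, {decimals or 0})"
    if dtype == "S":                       # zoned decimal
        return f"{name} NUMERIC({length}, {decimals or 0})"
    if dtype == "L":                       # date
        return f"{name} DATE"
    raise ValueError(f"unhandled DDS type: {dtype}")

def make_create_table(table, fields):
    """Assemble a CREATE TABLE statement from a list of field tuples."""
    cols = ",\n  ".join(dds_field_to_ddl(*f) for f in fields)
    return f"CREATE TABLE {table} (\n  {cols}\n)"

# Hypothetical customer master: number, name, balance due, open date.
print(make_create_table("CUSTMAST", [
    ("CUSTNO", 7, "P", 0),
    ("CUSTNM", 30, "A"),
    ("BALDUE", 9, "P", 2),
    ("OPNDAT", 0, "L"),
]))
```

Production converters handle far more (keys, CCSIDs, edit codes, reference fields), which is exactly why the demos drew attention.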
Nut Graph: Why This Shift Mattered
AI reached practical utility on IBM i just as many teams continued to treat the host as the single source of truth through source physical files and member-based edits. That tension made the gains precarious. Local development—where AI pair programming thrives—often collided with environments that mirrored code to Git nightly, treated pull requests as ceremony, and promoted via scripts maintained by folklore. The consequences were immediate. Change volume surged, but reviews did not scale. Security checks, SQL performance tuning, and boundary tests slipped through manual nets. Without automated gates and auditable pipelines, the risk profile tilted in the wrong direction. As one operations lead put it, “AI amplified what was already brittle.”
Inside the Workflow Clash
Veteran developers prized library discipline and years of muscle memory in PDM, RDi’s Remote System Explorer, and Code for IBM i. New hires expected local clones, branches, and PRs backed by CI. The culture gap widened when AI entered the room. “AI made branching nonnegotiable,” a team lead said. “Otherwise, how do you isolate and reason about thousands of generated lines?” Git’s role became the fault line. In many shops, repositories served as backups, not the system of record. Symptoms were easy to spot: nightly exports, minimal branching, and releases triggered by emails or spreadsheets. In contrast, Git-native teams ran feature branches, enforced PR reviews, and kept rollback targeted and fast. The difference showed up in audits, too—pipelines produced a trail; ad hoc scripts produced questions.
The Pivot: From Host-Centric To Git-Native
Eradani argued that the path forward began on developer machines. Local clones unlocked AI-assisted iteration, quick tests, and safe experiments. Standardized sync and packaging then moved changes to IBM i for builds and deploys, preserving the platform’s strengths while freeing teams from host-only editing. “Local-first with IBM i-aware automation changed the conversation,” an engineer noted. “We stopped debating tools and started debating code.”
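The sync-and-package tooling itself is proprietary, but its central idea, translating paths in a local clone into host build targets, can be sketched. A minimal, hypothetical mapping, assuming a repository layout where directories mirror source physical files (e.g., `qrpglesrc/ORDSRV.rpgle`); the layout and names are illustrative, not any vendor's actual convention:

```python
from pathlib import PurePosixPath

# Hypothetical repo layout: <srcfile-dir>/<member>.<type>
# Extensions map to IBM i source member types (subset, for illustration).
EXT_TO_SRCTYPE = {".rpgle": "RPGLE", ".sqlrpgle": "SQLRPGLE", ".clle": "CLLE"}

def path_to_member(repo_path, library):
    """Map a changed path in the local clone to its IBM i build target."""
    p = PurePosixPath(repo_path)
    ext = p.suffix.lower()
    if ext not in EXT_TO_SRCTYPE:
        raise ValueError(f"not a recognized source member: {repo_path}")
    return {
        "library": library,
        "srcfile": p.parent.name.upper(),        # e.g. QRPGLESRC
        "member": p.stem.split(".")[0].upper(),  # e.g. ORDSRV
        "srctype": EXT_TO_SRCTYPE[ext],
    }

# A CI job could run this over `git diff --name-only` output to build
# only the members a commit actually touched.
print(path_to_member("qrpglesrc/ORDSRV.rpgle", "DEVLIB"))
```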
The second move was governance. CI/CD pipelines added static analysis with tools like SonarQube, syntax checks, unit and integration tests, and environment-specific approvals. Policy gates enforced review coverage and quality thresholds regardless of authoring source—human or AI. Eradani’s customers reported fewer manual promotion steps, higher review participation, and faster, commit-linked rollbacks. “It wasn’t about distrusting AI,” a security lead said. “It was about trusting the process.”
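A policy gate of that kind is simple in structure: collect a few signals from the pipeline and apply the same thresholds to every change. A minimal sketch, with made-up threshold values and input fields standing in for whatever the CI system and analysis tools actually report:

```python
from dataclasses import dataclass

@dataclass
class GateInput:
    """Signals a pipeline might collect for one change (hypothetical fields)."""
    approvals: int        # completed PR reviews
    coverage_pct: float   # test coverage reported by the build
    blocker_issues: int   # blocker-severity findings from static analysis

def passes_gate(g, min_approvals=1, min_coverage=70.0):
    """Apply identical thresholds to every change, human- or AI-authored."""
    reasons = []
    if g.approvals < min_approvals:
        reasons.append(f"needs {min_approvals} approval(s), has {g.approvals}")
    if g.coverage_pct < min_coverage:
        reasons.append(f"coverage {g.coverage_pct}% below {min_coverage}%")
    if g.blocker_issues > 0:
        reasons.append(f"{g.blocker_issues} blocker issue(s) open")
    return (not reasons), reasons
```

The point the security lead made lives in the function signature: nothing about the author appears in it, so AI-generated code cannot take a shortcut around review.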
Field Notes: Voices, Data, and Turning Points
User groups shared that AI experiments clustered around modernization work—SQL conversions, service wrappers, and refactors that had languished on backlogs. Early adopters cited notable time-to-code gains, especially for repetitive patterns and test scaffolds. Yet the most significant wins surfaced after pipelines matured. “The day we turned on branch protections, our defect escape rate dropped within a sprint,” one manager said.
Blended environments proved workable when tools respected existing structures. Teams kept PDM and RDi for certain edits while contributing through branches with shared build scripts and the same deploy engine. Integrations with GitHub, GitLab, Azure DevOps, or Bitbucket connected to Jira or ServiceNow for change control, and releases became artifacts with owners, not events with mysteries. Eradani’s iDeploy surfaced as a bridge for IBM i-aware releases, keeping audits clean and rollback precise.
The Roadmap: From Inventory To AI at Scale
Progress followed a phased cadence. First, teams mapped libraries, source files, members, dependencies, and artifact flows, then defined “done” and rollback criteria. Next, Git became the system of record—one repository per application, branching standards, required PRs, and named code owners. Automation followed: static analysis, tests, and environment-specific deploy stages with approvals and logs. Only then did AI move from experiments to policy-governed branches, with metadata tagging AI-authored commits for traceability.
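Tagging AI-authored commits can piggyback on git's existing trailer convention: key-value lines in the final paragraph of a commit message (the mechanism `git interpret-trailers` reads). The `AI-Assisted` key below is a hypothetical policy choice, not a git standard; a sketch of how a pipeline might detect it:

```python
def parse_trailers(message):
    """Extract trailer lines ("Key: value") from a commit message's last paragraph."""
    last_para = message.rstrip().split("\n\n")[-1]
    trailers = {}
    for line in last_para.splitlines():
        if ": " in line:
            key, _, value = line.partition(": ")
            trailers[key.strip()] = value.strip()
    return trailers

def is_ai_assisted(message):
    """True if the commit carries the (hypothetical) AI-Assisted: yes trailer."""
    return parse_trailers(message).get("AI-Assisted", "").lower() == "yes"

msg = """Refactor order pricing into a service program

AI-Assisted: yes
Reviewed-by: J. Doe"""

print(is_ai_assisted(msg))
```

With trailers in place, auditors can answer "which production changes were AI-authored, and who reviewed them?" with a single log query instead of interviews.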
Training sealed the shift. Short workshops on branching etiquette, PR reviews, and rollback drills gave veterans confidence and helped new hires understand IBM i constraints. Shared templates—for apps, pipelines, and release notes—created consistent outcomes across different authoring tools. “Parity was the point,” a delivery director said. “No matter where code started, it ended up accountable the same way.”
Conclusion: The New Standard Took Shape
As AI matured on IBM i, the decisive factor was delivery discipline rather than model wizardry, and teams that advanced from host-centric practices to Git-native pipelines gained speed without trading away safety. The clearest next steps involved establishing Git as the system of record, enforcing branch protections and PR reviews, automating analysis and tests, and channeling AI changes through the same governed path. With that foundation in place, modernization accelerated, audits quieted, and releases moved from anxious events to routine operations. At POWERUp, the takeaway was settled: AI coding on IBM i worked best when pipelines did the heavy lifting and every change told its own story from commit to production.
