ATF moving all apps, services to the cloud by October

ATF's CTO detailed how the bureau got from "really ignoring" its IT to all-in on the cloud in a few years.
Lauren Marakas, senior special canine handler from the Bureau of Alcohol, Tobacco, Firearms, and Explosives, walks with Ruthie, an explosives detection canine, to search for explosives during a joint explosive detection training exercise. (U.S. Air Force photo by Senior Airman Brett Clashman)

The Bureau of Alcohol, Tobacco, Firearms and Explosives is nearing a milestone of closing its last remaining data center and moving all of its data and applications to the cloud.

By the end of the fiscal year, ATF wants to have all of its users transitioned over to a new commercial cloud environment and plans to close down the data center and repurpose it for office space, bureau CTO Mason McDaniel told FedScoop.

McDaniel detailed ATF’s transition to this point as a law enforcement agency with two vastly different IT environments. Prior to kicking off its modernization efforts in 2016, the bureau had been “really ignoring” its IT, McDaniel told FedScoop in a December interview before speaking at the Amazon Web Services re:Invent conference in Las Vegas.

The writing was on the wall after ATF shut down its disaster recovery data center in 2013 and moved remaining computing instances to the primary center. The bureau needed to make quick changes to its IT environment or risk losing critical functionality if something were to happen at that facility. “So we had no disaster recovery at all,” McDaniel said.

In 2016, such a crisis almost occurred when a huge snowfall forced the agency to evacuate the data center for two days for safety reasons, leaving it up to chance that systems would stay online running smoothly. “If they went offline, they would be down until we could get back in,” McDaniel said.

It was an eye-opening moment that forced McDaniel and other IT officials to communicate the risk of holding onto this technical debt, which he described as “not doing stuff” that the agency should have.

“The way you have to communicate to the executives who are largely not technical is trying to take whatever the technical debt is you’ve accumulated through not doing stuff in the past and communicate it in business terms,” he said. “Again if that data center goes away, then all ATF systems drop offline with no primary and failover and no estimate for times to restore.”

Investing in the ‘invisible’

McDaniel described the situation as a prime example of why IT management and modernization is so difficult in the federal government, and why technical leaders must learn to communicate in terms of business and mission.

“Invisible IT has to be invested in, even if it seems to be running for the users,” he said. “It’s really easy to understand how body armor, bullets, vehicles directly support the ATF mission. But it’s a lot harder to see why they should invest in IT when users are using the systems every day” seemingly without problems, especially when the bureau is struggling to keep the administration from trimming its budget.

But once IT leaders had amassed meaningful evidence of how neglecting IT could backfire in bigger ways, such as the data center snow day in 2016, “the executive buy-in was amazing,” McDaniel said.

Then the hard work started. After convincing leadership that the move to a commercial cloud was the right choice, McDaniel said, “we moved data from our development environment over, our test environment and we moved all production data for all ATF apps into AWS and got it fully security reviewed and accredited for live use before we ever tackled the first app. So we got all the operational data over there and then we started one-by-one going through and refactoring the apps.”

Cleaning up apps so that they functioned properly in the cloud proved especially challenging. For instance, “in one of our biggest applications, 80 percent of the source code didn’t work,” McDaniel explained. “It’s a whole lot more iterative and it’s a lot slower” than you would expect.

“There’s a lot going on — it’s amazing the dirt we discovered inside our systems that we just had no idea,” he said.

“The US Bureau of Alcohol, Tobacco, Firearms and Explosives is a great example of how thinking big can allow government agencies to successfully manage the move to cloud, and retire decades of technical debt,” said Dave Levy, Vice President of U.S. Government at AWS. “We remain committed to supporting the ATF’s critical mission and for technology to continue driving positive mission outcomes.”

Governance matters

On top of the coding issues, the agency ran into a roadblock with the policy and governance that enable the technical work.

“The technology is all fantastic and this is hard as hell,” McDaniel said. “There’s nothing easy about it. But what’s taken a similar amount of effort is on the process and policy side … It doesn’t do you any good if you can build a virtual machine or container in five minutes if it takes a month to get it through your change control process.”

McDaniel’s team wrote a new governance framework from scratch to account for automated testing and deployment needs that weren’t in the ATF’s legacy IT environment. “We’re hoping that as we get all of the technology in place to do the automated deployments, the policies and processes will be there to actually support and enable those,” he said.

Now, though about “two years behind where we wanted to be,” McDaniel said, ATF is planning in the February and March timeframe to switch users over to about three-quarters of the application portfolio in the cloud. Then sometime this summer, the bureau will switch users over to remaining applications.

When the data center is taken offline and everything is in the cloud, users won’t necessarily notice a change, “aside from that the systems don’t go down every day” and general performance should improve, McDaniel said.

However, slowly but surely, the benefits will begin to stack up for users. “What they’re going to see soon after that, once we finish this part, is a focus back on the actual processes themselves,” McDaniel said. “Many, many of the processes that our users and analysts and agents have to go through require them to go from system to system to system because of how we built things in projects over time. Every time something new was needed, the teams that developed the old ones were gone. And over time, we’ve got all the tiny little disconnected systems so the users have to manually go back and forth between them to do stuff.”

With applications seamlessly connected in the cloud, it will cut down by “half or more the amount of time it takes them to do a lot of their daily activities,” he said. “That’s when they’re really going to start seeing the benefits.”

Written by Billy Mitchell

Billy Mitchell is Senior Vice President and Executive Editor of Scoop News Group's editorial brands. He oversees operations, strategy and growth of SNG's award-winning tech publications, FedScoop, StateScoop, CyberScoop, EdScoop and DefenseScoop. After earning his journalism degree at Virginia Tech and winning the school's Excellence in Print Journalism award, Billy received his master's degree from New York University in magazine writing while interning at publications like Rolling Stone.