A large company with facilities all over the Asia-Pacific region opted to modernize. They'd just finished a pile of internal development that extended the functionality of a third-party package, and they wanted to containerize the whole shebang.
That's where Fred came in, about 9 months into a 12-month effort. Things hadn't gone well, but a lot of the struggles were growing pains. Many of the containers had been built as gigantic monoliths, and a lot of the settings you'd need for a Kubernetes deployment weren't properly configured. It was a mess, but it wasn't a WTF, just a lot of work.
The effort of building out containerized Kubernetes deployments changed the company culture. The new rule was "container all the things", and all the things got containerized. Management was happy to be checking off new buzzwords, and the IT team was happy to have any sort of organization or process around deployments and their environment, since prior to this effort it was a lot of "copy this folder to this server, reboot, and cross your fingers".
Everyone was happy, except for one team, led by Harry. "Your containers broke our software," Harry complained. Now, it wasn't really "their" software; it was a purchased product. And like many enterprise software packages, it was licensed. That license was enforced via a smart card and a USB dongle.
Which created a problem. Kubernetes is built around running your code abstracted away from the physical hardware it actually lands on; the USB dongle requires the code to run on one specific physical machine. The solution to the problem was obvious and simple: don't manage this one product via Kubernetes.
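For context: the usual way to wedge a hardware-bound workload like this into Kubernetes is to pin its pod to the one node that physically holds the dongle and pass the USB device through from the host. Here's a minimal sketch of what that looks like, using the official Kubernetes Python client; the node label, image name, and device path are hypothetical, not the company's actual configuration.

```python
# A hedged sketch, not the company's manifest: pin a dongle-licensed workload
# to the one node that has the USB key, and pass the device through from the host.
from kubernetes import client

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="licensed-app"),
    spec=client.V1PodSpec(
        # Only the node physically holding the dongle may run this pod.
        node_selector={"example.com/usb-dongle": "true"},
        containers=[
            client.V1Container(
                name="licensed-app",
                image="registry.example.com/licensed-app:1.0",
                # Privileged access so the container can talk to the
                # smart-card reader exposed by the host.
                security_context=client.V1SecurityContext(privileged=True),
                volume_mounts=[
                    client.V1VolumeMount(name="usb", mount_path="/dev/bus/usb"),
                ],
            ),
        ],
        # hostPath mounts the host's USB bus straight into the container.
        volumes=[
            client.V1Volume(
                name="usb",
                host_path=client.V1HostPathVolumeSource(path="/dev/bus/usb"),
            ),
        ],
    ),
)

# Render the equivalent YAML manifest (no cluster required).
import yaml
print(yaml.safe_dump(client.ApiClient().sanitize_for_serialization(pod)))
```

Pinning the workload to a single box and giving it privileged access to the host's USB bus throws away most of what Kubernetes is supposed to buy you, which is why the obvious answer was to leave this one product out of it.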
But that wasn't an option. "We use Kubernetes to manage all of our deployments," management said. One of the managers helpfully linked to an interview in a trade magazine where the CTO cheerily shared the benefits of Kubernetes.
"Right, but this particular product is ill-suited to that kind of deployment," Fred and his team countered.
"Okay, yes, but we use Kubernetes for all of our deployments."
And, as of this writing, that's where things sit. Everything must be containerized and hosted in Kubernetes. One software product doesn't play nice in that environment, yet it must be wedged in anyway. Currently, the product lives on the same private server it always did, but that's a "stopgap" until "a solution is found". Officially it's out of compliance with company standards, and it shows up as an evil red light on all the IT dashboards.