CrankySec

What can possibly go wrong?

I was watching Daniel Stenberg's FOSDEM video about the curl project's approach to security, and it got me thinking a bit about how wild it is that an open source project operating on a shoestring budget can deliver such a ubiquitous piece of software with such quality. And that shit's written in C.

Watching the video, it's clear that Stenberg cares deeply about curl. Stenberg and the other folks maintaining curl are thoughtful, careful, knowledgeable, and dedicated to the goal of delivering quality software that runs on probably billions of devices. For free. Software that is used by other software, much of it commercial. In other words: some people make money from the work performed by the curl maintainers. That's not news, and it's true for a whole lot of open source software. It's a labor of love, generally speaking.

On the other end of the spectrum, you have this trend of just YOLO-ing code into existence without knowing anything about it. You can blame generative AI for making this popular, but the habit of yanking code from somewhere and plopping it in a project is not new. StackOverflow-assisted code has been around for a while, after all. If this is how you develop software, you shouldn't be calling yourself a software developer. That would imply that you share a profession with people who know what they are doing, and you don't. Typing and/or copying and pasting code doesn't make you a software developer, and every software developer worth their salt will tell you that typing code into an IDE is the easy part. Thinking about what you're doing, being careful, and making design decisions based on experience and a shared body of knowledge is the difference.

And things are getting insane. If you subscribe to any number of popular newsletters about software development, operations, cybersecurity, networking, et cetera, you'll certainly find yourself clicking some link that will take you to a GitHub repo for some software that says it will do something. And since the barrier to entry for software development, distribution, and discoverability is so low, and the incentives are all wrong, this is bad news. Or good news, depending on where you sit.

The incentives are bizarre. Imagine, if you will, a newsletter for developers that goes out every weekday. Let's call it "The Source Material Was Too Verbose, I Have Time Constraints." You need to put something out every weekday because your sponsors would be mad if you didn't. You hire someone on Fiverr to feed some links to some AI bot that's tasked with summarizing the link so you can build your newsletter and send it out to your millions of subscribers. You are going to need a lot of links. And rest assured that the person "curating" these links is not doing any kind of review.

You start just feeding the AI some GitHub repos with tools made by who the fuck knows, and sending those out to your audience. Next thing you know, the newsletter you trust and love is telling you to paste your AWS IAM policies into some random Vercel app that will turn them into a pretty picture that may or may not be useful. Tomorrow you'll get a link to a Node.js app that will check if your API keys are strong.

That paints a terrible picture of the state of things: a lot of people don't know what they're doing, and the people in charge don't know what the people doing things should know. If your boss thinks Cursor can replace you, there's not much you can do, because there's a fundamental conflict that cannot be resolved until your boss understands things at a level above "who gives a shit?". Don't count on that.

For those of us in cybersecurity or related fields, this is a challenge. The developers are just using whatever the hell they come across to build the software: copilots, agents, VS Code plugins, MCPs, libraries and tools made by developers who are just using whatever the hell they come across to cobble together the software, etc. The Ops people are just using whatever GitHub Action they found on the street and | sh-ing shit left and right without looking. Defenders cannot possibly know what's even happening because nothing looks normal, so how can you look for deviations from a baseline? When everything blows up and sensitive data spills everywhere, nothing happens. I mean, you might get fired, but in the grand scheme of things, nothing happens. Shit, Oracle Cloud's data's been floating around for weeks and they are not even willing to acknowledge there was a problem. They have a TikTok to buy so Lawrence can make more money to purchase the moon or something, no time for "cybersecurity".
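For what it's worth, the blind `curl | sh` habit has a cheap alternative: download first, check the file against a checksum you got out-of-band (a release page, a signed announcement), and only then run it. A minimal sketch — the installer name, URL, and checksum below are hypothetical placeholders, not a real project's:

```shell
# Safer alternative to `curl ... | sh`: download, verify a pinned SHA-256
# obtained out-of-band, and only then execute. Placeholders throughout.
verify_then_run() {
    file="$1"; expected="$2"
    # sha256sum -c reads "<hash>  <filename>" lines and exits non-zero on mismatch
    echo "${expected}  ${file}" | sha256sum -c - >/dev/null 2>&1 || {
        echo "checksum mismatch for ${file}; refusing to run" >&2
        return 1
    }
    sh "$file"
}

# Usage (hypothetical URL; pin the digest from the project's release page):
#   curl -fsSLo install.sh https://example.com/install.sh
#   verify_then_run install.sh "<published sha256 hex digest>"
```

It won't save you from a compromised release page, but it does stop a swapped-out or truncated script from running on your box.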

It's a shit show, and I am terribly sorry you're in this field or trying to get in, because not a whole lot of people want to have you around. For those of you who insist on doing this, here's my hot take: start thinking about this in a different way. I believe that the future of cybersecurity and data protection is very tightly coupled with extreme observability. The traditional security controls won't do anymore. You will need to know exactly what's going on with everything that matters at all times. You will need to think a lot about ways to determine what normal looks like so you can catch the abnormalities. Worrying about the endpoint was already pointless (no pun intended) years ago, and we're entering the age of Negative Trust, because Zero Trust won't cut it.
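The "learn normal, flag deviations" idea is simpler than it sounds at its core. A toy sketch in shell — the process names are invented, and a real pipeline would feed this from eBPF or auditd events rather than a hardcoded list:

```shell
# Toy baseline-vs-deviation check: record the set of process names seen
# during a known-good window, then report anything new in the live stream.
# Names are invented for the example; real input would come from eBPF/auditd.
baseline_file=$(mktemp)
printf '%s\n' sshd nginx cron | sort -u > "$baseline_file"

# Live events: two known names and one never seen before.
# comm -13 prints lines that appear only in the second input (the live stream).
printf '%s\n' nginx xmrig cron | sort -u |
    comm -13 "$baseline_file" -

rm -f "$baseline_file"
```

Everything hard about this problem lives in building a baseline that's actually representative — which is exactly why you need the visibility first.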

Better brush up on that eBPF.