The NSA, the original primary developer of SELinux, released the first version to the open source development community under the GNU GPL on December 22, 2000.[6] The software was merged into the mainline Linux kernel 2.6.0-test3, released on August 8, 2003. Other significant contributors include Red Hat, Network Associates, Secure Computing Corporation, Tresys Technology, and Trusted Computer Solutions.
I'm not sure why that's a problem. The NSA needed strong security so they created a project to serve the need. They are no longer in charge of SELinux but I wouldn't be surprised if they still worked on it occasionally.
There are a lot of reasons to not like the NSA but SELinux is not one of them.
That's the trouble with the NSA. They want to spy on people, but they also need to protect American companies from foreign spies. When you use their stuff, it's hard to be sure which part of the NSA was involved, or whether both were in some way.
The NSA has a fairly specific pattern of behavior. They work in the shadows, not in the open. They target things with low visibility so it is hard to trace. Backdooring SELinux would be uncharacteristic and silly. They target things like hardware supply chains and ISPs. Their operations aren't even that covert, as they work with companies.
In 98/99 they released a set of PDF, INI, and INF files with official NSA guidelines on building an NSA-approved NT4 server/workstation, suitable for running in their internal environments, from a scratch install. So I took a box and the NT discs, went home, and hammered on it for months. In part, it let me move from being a boutique development admin to working on the JSF X-32 at Boeing just two years later.
If they afterwards released it under a Free (Libre) Software licence then it's fine. The licence itself prohibits obfuscation, or combining obfuscated code with libre code. If you have the entire code, not just some part (as most companies release when they go Open Source rather than free software), then you don't have to worry about unknown behavior, because everything is in the source.
I mean, if you have the entire source then you have everything needed to reproduce the program. Finding a malicious part doesn't depend only on the source but also on the inspector, that is true.
But anyway, having the entire code, and not just the part that a company feels it can share, is better. Even if it's literally malware.
Free software users depend on the community to detect malicious code. But at least with source code there's a way of doing so.
If I tell you that this building has a structural deformation, having access to the architect's blueprints and list of materials is better than just being able to go inside the building and search for it, no?
While they created a set of patches implementing the security features SELinux provides, what was actually merged was the result of several years of open collaboration and development towards implementing those features.
There's general agreement that the idea that the NSA proposed is good and an improvement, but there was, and still is, disagreement about the specific implementation approaches.
To avoid issues, the approach taken was to create a more generic system that SELinux would then take advantage of. That's why SELinux, AppArmor, and others can live side by side without it being a constant maintenance and security nightmare. Each one lives in its own little self-contained, auditable box, and the kernel just makes the "check authorization" function call, which flows into the right module by configuration.
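To make the shape of that concrete, here's a toy sketch in C of the hook pattern. This is not the actual kernel LSM code, just the general idea; names like register_open_hook and security_file_open are made up for the illustration:

    /* Toy sketch of the hook pattern -- not actual kernel code.
     * Security modules register callbacks; core code makes one
     * generic "check authorization" call and whichever modules
     * are configured get to weigh in. */
    #include <stdio.h>
    #include <string.h>

    typedef int (*open_hook_t)(const char *path, int flags);

    #define MAX_HOOKS 8
    static open_hook_t open_hooks[MAX_HOOKS];
    static int n_hooks;

    static void register_open_hook(open_hook_t hook)
    {
        if (n_hooks < MAX_HOOKS)
            open_hooks[n_hooks++] = hook;
    }

    /* What core code calls: the first module to deny wins. */
    static int security_file_open(const char *path, int flags)
    {
        for (int i = 0; i < n_hooks; i++) {
            int rc = open_hooks[i](path, flags);
            if (rc != 0)
                return rc;  /* some module said no */
        }
        return 0;           /* every module allowed it */
    }

    /* A stand-in "SELinux-like" policy module. */
    static int deny_shadow(const char *path, int flags)
    {
        (void)flags;
        return strcmp(path, "/etc/shadow") == 0 ? -1 : 0;
    }

    int main(void)
    {
        register_open_hook(deny_shadow);
        printf("/etc/passwd -> %d\n", security_file_open("/etc/passwd", 0));
        printf("/etc/shadow -> %d\n", security_file_open("/etc/shadow", 0));
        return 0;
    }

The real LSM interface is much bigger (hundreds of hooks covering files, sockets, IPC, and so on), but the dispatch idea is the same: the kernel calls the generic check, and the configured module answers.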
The Linux community was pretty paranoid about the NSA in 2000, so the code definitely got a lot more scrutiny than the typical proposal.
A much easier way to introduce a backdoor would be to start a tiny company that produces some arbitrary piece of hardware which you then add kernel support for.
Now you're adding code to the kernel, and with the right driver and development ability you can plausibly make changes that have non-obvious impacts. As a bonus, if someone notices, you can just say "oops!" and not be "the god-damned NSA" who everyone expects to be up to something, but instead four humble keyboard enthusiasts with an esoteric set of lighting and input opinions, of which there are a dime a dozen on Kickstarter.
Paranoia in the sense of being concerned with the ill intent of others, not in the sense of an irrational worry about persecution. Much like how the intelligence community itself is said to have institutional paranoia.
We saw a very sophisticated attack on Linux earlier this year with the xz backdoor. That stuff is terrifying and the sort of thing people should be worried about. SELinux is tame by comparison.
It is also important to note that it is pretty easy to do surveillance these days. People carry around cell phones, and there are massive camera systems that can track someone in high detail.
I haven't looked at the keyboard drivers, or much Linux source. I never really had a reason to do a lot of C other than small microcontroller projects.
But I see this stuff and think of how awesome it must have felt to get a different keyboard working on an OS the first time. I have to do all this stuff with cloud, and api levels, and configuring CI/CD pipelines, and sometimes I get to write backend C# code or they let me play in the front end. Most of the time it's telling another team of developers what to do, and listening to our clients explain the problems and I have to figure out if we already have anything to fulfill at least some of those needs.
These drivers are the divine marriage of an OS and hardware that isn't native to the machine it's running on. It's so beautiful to read. You can visualize where the values enter a memory address, and bits get shifted, or something is static so the keyboard always uses the right thing.
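For anyone who hasn't read driver code, the flavor is something like this toy fragment, loosely in the spirit of the classic i8042 PS/2 keyboard controller. The port numbers and bit meanings match that old hardware, but the inb() here is stubbed so the example stands alone; a real driver would do an actual port read:

    #include <stdio.h>
    #include <stdint.h>

    #define KBD_STATUS_PORT    0x64  /* controller status register */
    #define KBD_DATA_PORT      0x60  /* scancode shows up here */
    #define STATUS_OUTPUT_FULL 0x01  /* bit 0: a byte is waiting */
    #define SCANCODE_RELEASE   0x80  /* bit 7 set: key release */

    /* Stub standing in for a real port read. */
    static uint8_t inb(uint16_t port)
    {
        return port == KBD_STATUS_PORT ? STATUS_OUTPUT_FULL
                                       : 0x9E; /* 0x1E | 0x80: 'A' released */
    }

    int main(void)
    {
        /* Wait until the controller says a byte is waiting... */
        while (!(inb(KBD_STATUS_PORT) & STATUS_OUTPUT_FULL))
            ;
        uint8_t sc = inb(KBD_DATA_PORT);

        /* ...then the high bit distinguishes press from release,
         * and the low 7 bits are the key itself. */
        int     released = sc & SCANCODE_RELEASE;
        uint8_t key      = sc & 0x7F;
        printf("key 0x%02X %s\n", key, released ? "released" : "pressed");
        return 0;
    }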
I mean, leaving aside their surveillance tasks, it's still their job to ensure national security. It's in their best interest to keep at least themselves and their nation safe, and considering how prevalent Linux is on servers, they likely saw a net benefit this way. They even open-sourced their reverse engineering toolkit Ghidra in a similar vein.
Ghidra was about hiring and cost savings. It's easier to hire when people already know your tools. Also, people are more willing to use your tools rather than expensive ones if they can still use them when they leave (go into contracting). Interoperability with contractors may improve too.
I mean, it’s still Open Source, right? So it would be pretty hard for them to hide a backdoor or something??
Right, but maybe it's SELinux combined with other tools they have that helps them with some exploit.
Like they figured out an exploit but needed SELinux as a piece of the puzzle. It's open source and we can all read the code, but we can't see the other pieces of the puzzle.
I mean, they almost certainly have built-in backdoors like the Intel ME. When you can force hardware manufacturers to add shit, you don't have to think up convoluted solutions like that.
I maintain open source software on a much smaller, less security-critical codebase. We have dozens of maintainers on a project with about 3k stars on GitHub. Things get by that are potentially security vulnerabilities, and we don't know until upstream sources tell us there is a vulnerability.
People don't understand that the way a backdoor is usually implemented is not going to obviously say "backdoor_here", nor will it look like some magic code loading a large string and unzipping it on the fly -- that's sus af. What you will see is some "play video" functionality with a very subtle buffer overflow bug that's also not trivially triggerable.
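Purely hypothetical illustration, not from any real codebase, of how unremarkable such a bug can look:

    #include <string.h>
    #include <stdint.h>

    #define TITLE_MAX 64

    struct video {
        char    title[TITLE_MAX];
        uint8_t codec;             /* happens to sit right after the title */
    };

    /* Copies the title out of an untrusted container header. */
    void set_title(struct video *v, const char *src, size_t len)
    {
        /* Looks like a bounds check, but it should be len >= TITLE_MAX
         * to leave room for the terminator. */
        if (len > TITLE_MAX)
            len = TITLE_MAX;
        memcpy(v->title, src, len);
        v->title[len] = '\0';      /* one-byte overflow when len == TITLE_MAX */
    }

A reviewer skims it, sees a length check and a memcpy, and moves on; meanwhile whoever controls the header gets to clobber the byte after the buffer.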
This is also probably the reason why you lost your DARPA funding: they more than likely caught wind of the fact that those backdoors were present and didn't want to create any derivative products based on them.
Though this implies that the Department of Defense doesn't want to use compromised tools, since DARPA is DoD. NSA is also DoD.
I did some follow-up research and found that subsequent audits found no backdoors. They're either incredibly sneaky, or the person making these claims wasn't being entirely honest.
Do you know of any good comprehensive followup to this? A quick search shows me lots of outdated info and inconclusive articles. Do you know if they conclusively found anything or if there is a good writeup on the whole situation?
Not really. Do you know how many proprietary, company-specific extensions and modules there are of the Linux kernel out there?
Loads of companies choose not to contribute their stuff back upstream. I don't know why the NSA originally did in the case of SELinux, but I would guess it had to do with transparency, national defense, and not carrying the burden of a module/fork solo. They were also not the only contributors even early on, according to the Wikipedia page.
Also, if I recall correctly, there was no other option for MAC back then (no AppArmor or Tomoyo).
GPLv2 only says that people with access to the binary need access to the source code too. If they only used it internally they'd never have to make it public.
Do you have more recent information by Signal on the topic? The GitHub issue you linked is actually concerned with publicly hosting APKs. They also seem to have been offering reproducible builds for a good while, though it's currently broken according to a recent issue.
There was a "ultra private" messaging app that was actually created by a US state agency to catch the shady people who would desire to use an app promising absolute privacy. Operation "Trojan Shield".
The FBI created a company called ANOM and sold a "de-Googled ultra private smartphone" and a messaging app that "encrypts everything" when actually the device and the app logged the absolute shit out of the users, catching all sorts of criminal activity.
I have no proof, but I do have a small list of companies I actually suspect of pulling a similar stunt... perhaps not necessarily attached to the FBI or any other agency, but something about their marketing and business model screams "fishing for people who have something to hide".
For people interested in the subject, read "This Is How They Tell Me the World Ends: The Cyberweapons Arms Race".
TL;DR: current day software is built on codebases with hundreds of thousands of lines of code. An early NSA hacker put forward the idea that a 100k LoC program will never be free of an exploitable hole.
To be the target of a 0-day, you would have to piss off state-level actors.
You wouldn't phrase it like that. Android is based on Linux, and SELinux is part of the Linux security subsystem. Android makes use of SELinux features, among others, for security sandboxing.
Android also runs each app as a separate Linux user (separate UID). That, combined with SELinux sandboxing and the Android permission model, makes it a pretty secure OS.
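The UID part is just ordinary Unix permissions doing the work. Here's a minimal sketch of the mechanism; it has to run as root so setuid() succeeds, and the UIDs 10001/10002 are made up for the demo (though they fall in the range Android actually assigns to apps):

    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        /* "App A" (uid 10001) writes a private file, mode 0600. */
        pid_t pid = fork();
        if (pid == 0) {
            setuid(10001);
            int fd = open("/tmp/app_a.dat", O_CREAT | O_WRONLY, 0600);
            write(fd, "secret", 6);
            close(fd);
            _exit(0);
        }
        waitpid(pid, NULL, 0);

        /* "App B" (uid 10002) tries to read it and is refused. */
        pid = fork();
        if (pid == 0) {
            setuid(10002);
            int fd = open("/tmp/app_a.dat", O_RDONLY);
            printf(fd < 0 ? "app B: permission denied\n"
                          : "app B: read app A's data!\n");
            _exit(0);
        }
        waitpid(pid, NULL, 0);
        return 0;
    }

SELinux then layers mandatory policy on top of this, so even a root process or a sloppy file mode can't hand data across the boundary.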
Why don't we do the Android model on the Linux desktop? We have immutable Linux distros, but everything I've seen still needs root to be available. I want a system where the core system can't be changed at runtime no matter what you do.
I have a feeling this is just looking for a clever way to say "but Linux isn't as secure as everyone thinks", which sure, yes. But also, not many people, especially knowledgeable people, are claiming that Linux is "secure".
And when it comes to "privacy friendly" that depends so much on what flavour of Linux you are using (Ubuntu? a minimal Arch? Tails?) that it's not really something you can make broad statements about.
Also, which terms? You can't call yourself an MD, RN, or an Attorney etc. in the US and many other countries if you aren't one. You can't market drugs that haven't been approved by the FDA. And bastardisation isn't a justification for no regulation.