440 points | by speckx 1 day ago
The military is allergic to internal software. First of all, you have to understand that IT in the military is a different world, far beyond your imagination, because every day they are under attack by the best nation-state cyber intrusion people on the planet. Your startup will never experience this, and a megacorp like Facebook will only barely experience anything like it, because it’s just not that important a target. The only commercial organizations that come even remotely close to what the military experiences daily are large financial organizations. Also, don’t forget the US military is the largest single employer on the planet, like 4.5 Walmarts.
So the military monitors everything at multiple levels. That also means locking things down to approved software lists at different levels. The people writing the security policies and/or performing the monitoring likely aren’t experienced software directors. They look at things like NPM and Maven and just see unlimited attack vectors from people who have no idea about security, and they aren’t wrong.
Then, in the civilian and contractor space, where do you put the code? If it’s paid for by the government then it’s owned by the government; otherwise the contractor company wants to own it, completely separated from the government, so they can charge the government for it. Then consider whether it’s a subcontractor not aligned with the financial goals of the prime contractor. Even when the government wants to do the right thing it’s complicated. As a software guy under a non-managing co-prime I really don’t give a shit and say just give the government everything, and it’s weird to see people try to throw up layers, especially since the government side is always less restricted because they almost always achieve vastly superior security accreditation on their infrastructure.
While I'll acknowledge that certain areas within the US military might experience the level of threat and security sophistication described, the broader landscape is fraught with legacy systems that struggle to integrate modern solutions or best practices. These outdated systems/workflows/bureaucracy often result in inefficiencies and vulnerabilities rather than providing superior security.
You never hear about the US military getting broken into, not because it doesn't happen, but because it is classified when it does. Avoiding public embarrassment is the No. 1 use of classification, and that's what produces this kind of belief in their competence. I'd put Facebook (who, incidentally, are a payment processor) ahead of the US military any day of the week, and it ain't close.
Where was this depiction made by the OP? All I see are descriptions of IT infrastructure under attacks of a nature and scale that not even software megacorps will ever see.
I worked at Microsoft, and speaking to their security team my impression was that MSFT is under persistent attack from nation states on a nonstop basis, up to and including working to get government assets hired at Microsoft to leak secrets out.
Given the importance of AWS, I have no doubt Amazon is under similar threat.
Hell, there is a whole “Azure for US Government” product out there just for that, and that’s in addition to the usual AD/OneDrive/SharePoint/Windows/etc. suspects.
Anyway Linux has 62.7% share of servers https://en.m.wikipedia.org/wiki/Usage_share_of_operating_sys...
You're saying there's no need to hack Linux when it's easier to hack Windows, and therefore Microsoft has better security fundamentals as the providers of a less secure but more prevalent OS? I don't follow the argument.
(A) IN GENERAL.—This Act shall not apply to classified source code or source code developed primarily for use in a national security system (as defined in section 11103 of title 40, United States Code).
(B) NATIONAL SECURITY.—An exemption from the requirements under section 3 shall apply to classified source code or source code developed—
(i) primarily for use in a national security system (as defined in section 11103 of title 40, United States Code); or
(ii) by an agency, or part of an agency, that is an element of the intelligence community (as defined in section 3(4) of the National Security Act of 1947 (50 U.S.C. 3003(4))).
I don't believe it.
Maybe not on the entire planet? I know it doesn't really matter, but going by the list, the largest isn't the US DoD. Then again, who believes Wikipedia anyway: https://en.wikipedia.org/wiki/List_of_largest_employers. Besides, even if it's true it's only by a whisker, and we shouldn't put further wars (i.e. the foreign freedom distribution enterprise) past the great nation of the USA, so that could change on a whim at any time. Cheers.
Also, if this link is to be believed, then 4 Walmarts (or even 4.5) would really be a lot.
https://en.wikipedia.org/wiki/List_of_largest_employers
That's actually kind of shocking to me given the difference in population between the countries.
Then what do they do? Aren't there tools on the market to check for security vulnerabilities, like SonarQube? Wouldn't (or couldn't) the military run a beefed-up version of that, so developers can't check in code without 100% compliance with its security rules?
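In principle the gate is just a pre-merge step that fails the build whenever the scanner reports anything above a severity threshold. A minimal sketch of that idea in Python follows; the `scan-cli` command, its flags, and its JSON output format are all placeholders for whatever SAST tool a pipeline actually wraps, not a real tool or anything the DoD is known to run:

```python
# Hypothetical pre-merge gate: block the check-in if the scanner reports findings.
# "scan-cli" is a placeholder command; its flags and output format are assumptions.
import json
import subprocess
import sys

def main() -> int:
    result = subprocess.run(
        ["scan-cli", "--format", "json", "--target", "."],  # placeholder CLI
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print("scanner failed to run:", result.stderr, file=sys.stderr)
        return 2  # fail closed: no scan result means no merge

    findings = json.loads(result.stdout or "[]")
    blocking = [f for f in findings if f.get("severity") in {"HIGH", "CRITICAL"}]
    for f in blocking:
        print(f"{f.get('file')}:{f.get('line')}: {f.get('rule')} ({f.get('severity')})")
    return 1 if blocking else 0  # non-zero exit blocks the check-in in CI

if __name__ == "__main__":
    sys.exit(main())
```

Wiring a check like this into a pre-receive hook or a required CI job is the usual way to make "can't check in" literal.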
And I encourage open source devs who get security compliance demands without patches or significant collaboration to ban the person filing the issue.
But on the other hand, open source, or giving things out for free for people to try out and use, is also a great way to find interesting friends who just want to build cool stuff together, and make new friends along the way :D That part is prolly the only reason why I like open source so much; it attracts certain kinds of interesting people depending on the project you’re working on.
Parts of the US DoD do have more rigorous testing that is considerably broader in scope than commercial linters and such, and evaluates for threats that commercial systems don't consider. Many of these tests reliably break open source software. It is unclear how thorough or exhaustive these audits or tests are -- it can be quite opaque. For good or bad, having been through several serious security audits by multiple organizations, my software always came back clean so it is difficult to calibrate their sensitivity from the outside.
You'd be better off going back to Ada.
Entirely user/library serialization or fine cryptographic issues, and not the VM, no?
I guess someone who’s never worked there could say this. Facebook is a treasure trove of data for details on domestic and foreign politicians.
You writing this makes me believe you don’t work around or with the IC and are speaking out of your lane.
On what assumption do you base this? Startups that have high research value don’t hit your radar as a target?
And really? Any given startup? Also, the OP used Facebook as the example.
I am baffled at your sense of security in nation state activity. Read the 2012 annual report to Congress about China. They collect everything.
Resources, even for nation states, are finite. At minimum, attention is a finite resource that limits ongoing operations. Active high-value targets make sense: defense, infrastructure, finance, and even, to some extent, media.
With that in mind, do you really think they’re interested in a startup that optimizes Google ads? Or how about postgres as a service with no clients of interest?
It’s not that I feel a sense of security, but low-success-rate scripted attacks aren’t what I’m talking about here (nor, for that matter, things like the perpetual port scanning of the internet; every entity seems to do that, looking for holes). We’re talking about active operations by skilled attackers, and there is only so much of that to go around.
Yeah, this always struck me as so strange. I worked at a contractor where our customer really did own all of the deliverables, but I had peers who worked at contractors where that wasn't the case. To them, I was all like, "Guys. Our tax money is paying for this. Why are you rooting for pulling money out of everyone's pocket to massively enrich a few fat cats and sales guys in your company?".
Like, if you're of a "We're not going to give everything to the government" persuasion, then the reasonable way to handle that is to give the government the deliverables to do anything they want with, just so long as you're free to sell the unclassified components of it (and things derived from them) in the private sector to interested parties without interference.
The Defense Sector is very much like the private sector in that it has hundreds of companies/teams doing pretty much the same thing.
> Per the new law, metadata includes information about whether custom code was developed under a contract or shared in a repository, the contract number, and a hyperlink to the repository where the code was shared.
Sadly it doesn't sound like the law requires agencies to make the code publicly open source, it just requires inter-agency sharing (bill full text [1]). They only need to share "metadata" publicly.
[1] https://www.congress.gov/bill/118th-congress/house-bill/9566...
This is a good first step. The next would be sharing with states, municipalities, and universities. Public sharing would spread IT capability and responsibility in a way that presently doesn’t exist.
At least that's my experience in a commercial setting: it's easier to publish something without restriction than to share it with a specific hardware or software partner only. The latter creates all kinds of questions around neutrality, applicability of NDAs, licensing, and so on.
What would be more interesting is to require all private companies doing US government contracts, especially the ones handling classified projects, to do the same as these US agencies!
> The BRL-CAD source code repository is the oldest known public version-controlled codebase in the world that's still under active development, dating back to 1983-12-16 00:10:31 UTC.
I’ve gone through “agile transitions” in government contracting, at a high level it starts out with a high concept idea of reducing lead times and increasing productivity. Then directives get handed down through layers of management, the decision is made to adopt Scrum or SAFe™, that gets handed down to middle management, who tailor the process in ways that specifically benefit themselves, and you end up with waterfall done poorly and with extra steps™.
What will happen is that there will be very loose definitions of source code and flexible interpretations of when code has to be released. If an agency does not want to share, they’ll find a way to evade and still check off the box.
So the bar has been high to keep it private for $$$ reasons, but you could always keep it private for any other reason.
DOE Code is the program that ostensibly tracks the open source software, usually just through github organizations. OSTI is the division that tracks all IP and research.
The exemptions are extremely broad in section 4 of the act. I don't expect anything interesting to come of this reporting. Or for any money to be saved.
There are other things already shared publicly like NASA IKOS:
https://github.com/NASA-SW-VnV/ikos
That one gets far less attention from third parties than it should. If it could be developed into a general purpose sound static analyzer that handles multithreading, it would help to improve many other projects.
Example: Snowden revelations.
It’s the sophisticated version of the “Don’t attempt any change” brigade’s position.
My observation from a lifetime in very large, cumbersome orgs is that improvement only comes from change, and that in highly dysfunctional, low-performance, low-ambition environments almost any reasonable change, supported by a really tiny number of engaged participants with a clue, leads to outsized positive step changes.
Even better, doing this as a sustained, tide-coming-in approach over several years can create more engaged people with a clue and a slow transition to high-ambition, moderate-to-good performance cultures.
It’s worth the effort if you’re not doing it alone, and know that all the attempts pay off as part of a cumulative push. It changes lives both in the service delivery org, as well as those they’re supposed to support.
> tide-coming-in approach over several years
This phrase does a bunch of work, and seems to almost agree with the cynical perspective that individual small positive changes (or the more common failed attempt at the same) are futile. But if the difference between optimism and cynicism was only a matter of being patient and persistent, then we should be able to observe things getting better over time in a relatively consistent way.
Is that happening? Honest question, what large organizations can we point to that are better or more effective than they were 5, 10, 50 years ago? (And: for any situations where improvement happened, was it really a tiny number of engaged participants doing bottom-up change, or was it top-down change by some kind of executive decree?)
Youth without perspective will maybe have a hard time answering, but if the youth and the wise old heads are both trending cynical at the same time, then maybe the cynical position is actually true, and patience/perspective are simply not as relevant as the optimist would hope. My own experience is probably somewhere between youth and wisdom, and I tend to avoid large orgs as much as possible! But as an outsider, it looks like large orgs are all dysfunctional by default and only get more dysfunctional over time, with or without external pressures forcing that situation. Maybe there's a bureaucratic version of the laws of thermodynamics at work here: the phenomenon of entropy isn't really cynical or optimistic or pessimistic, after all, it's just the way things are.
I’ve been part of turnarounds where senior execs have said that the three hundred people here will lose their job if nothing changes. I still talk to some of those teams that transformed themselves and others, and made it.
To me the default for any government software should be OSS unless ministerially signed off.
But I was young and naive
However, as always, we have not come far enough.
Ghidra is a great example of this, and having this software be free has been of great benefit to the security community.
When responding to RFPs, the open-source stuff gets a higher level of scrutiny than the closed systems. Like, if it's open then you have to show it's good, but if it's closed the vendor just says "yep, we are perfect" and the agency can move on. It feels like the agency and its employees don't want any responsibility. But I've never seen anyone lose their government job over some incompetence.
It’s about job protectionism on the greater scale, and about being the subject matter expert who leverages his code base for promotions on the individual scale. Generally, agencies do not view each other as working for the same team. It can be very competitive when lobbying Congress for resources for your agency.
I work in gov and am speaking from first-hand knowledge. The culture is toxic. It’s broken, and I can’t wait to see what changes Elon and Trump’s team propose.
> This version is being posted to GitHub as an experiment in collaborative tools for public engagement of government policy documents. Suggestions for changes or additions to this document by military or civilian personnel, contractors, and private citizens may be submitted as pull requests..
(2010) https://www.youtube.com/watch?v=WWt0YiXcEkE
Dan Risacher, from the DoD CIO's office, and open source security expert David A. Wheeler break down the history and ramifications of the recent DoD memo, which makes clear the Department of Defense's stance that open source is a viable, commercial form of software.
(2024) https://openssf.org/press-release/2024/10/29/openssf-expands... [Linux Foundation] OpenSSF recognizes the need for security education ... said David A. Wheeler, director, open source supply chain security at OpenSSF ... Since its inception, more than 25,000 individuals have enrolled in this course material.
That means increased competition and reduced costs for the government.
> or using this to throw stones based on existing contract code quality
That means code review, which results in improved code quality one way or another.
I fail to see the problem here.
In reality, company 2 wins on cost and doesn’t understand the context of what was built or the environment it was built in. They don’t understand the costs because they didn’t pay them. Company 2 quickly proposes “full rewrite!” The lower-cost labor they brought in can’t perform, and quality degrades until we have (insert Gov software program here).
Or it doesn’t happen.
As others have highlighted, public funds should lead to public good and open source is a great way to increase that benefit.
[0]: https://www.forgov.qld.gov.au/information-and-communication-...
What sort of code would pose privacy risks if shared? That sounds like some nasty intermixing of code and data.
Yes, government contractors will gladly admit between the lines that their code is garbage and that they largely rely on security through obscurity. And the information commissioner has agreed with them a few times!
Ironically, it's "sic", not "sik". Muphry's Law got you when you weren't looking. ;)
"If you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written."
EDIT: TIL this is a law of its own. Happy New Year!
As an example, think about what you might see in Excel formulas.
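To make the point concrete outside of Excel: once real records get hard-coded into the logic itself, the "source" can't be published without also publishing the data. A contrived Python sketch of that anti-pattern (every name and value below is invented):

```python
# Contrived example of code and data intermixed: the "logic" cannot be published
# without leaking personal records. All names and values are invented.
HARDSHIP_WAIVERS = {
    "123-45-6789": "Jane Doe",        # approved per case #4411 (fictional)
    "987-65-4321": "John Q. Public",
}

def waiver_applies(ssn: str) -> bool:
    # The business rule and the personal data live in the same file,
    # much like a lookup table pasted directly into an Excel formula.
    return ssn in HARDSHIP_WAIVERS
```

The usual fix is to keep the rule in code and the records in a protected data store, so the code can be released while the data stays private.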
[0]: https://www.techdirt.com/2012/07/18/senate-not-concerned-abo...
In my country, agencies use code as a medium of trade, like "I will give you the code to generate holograms to bypass video KYC, and you will give me manpower to spread fake news."
Because there is no sharing, agencies spend a lot of money developing multiple versions of the same thing, not all with great success. This bill should be a step forward in terms of quality and value.
Seems pretty clear they would.
Are all government contractors required to provide the source code for all developed applications? Or does this bill only apply to contracts where the deliverables actually include source code?
lol :)
What are the chances that something custom built for one agency is going to be at all useful to the custom needs of another agency?
Business logic tends to be <10%, the rest is just integrating stuff and piping data.
Where does the bill require interoperability?
https://www.congress.gov/bill/118th-congress/house-bill/9566...
This doesn't stop unscrupulous contractors from copyrighting code and charging license fees (see: most DoE code, with exceptions like NWCHEM). I've often wondered why this hasn't resulted in any lawsuits; I suspect the reason is that nobody really cares.
Do you know what law requires it?
> The report notes that the 2018 NDAA mandated DoD establish a pilot program on open source and a report on the program’s implementation. It also says that OMB’s M-16-21 memorandum requires all agencies to release at least 20 percent of custom-developed code as open-source, with a metric for calculating program performance.
Also, do govt software contractors worry much about AIs being trained on their codebases, attribution, etc.? Would that increase under this law?
None of this applies to state or local government: 17 USC 105 applies only to federal matters.
Attribution tends to be important to DOE people since they're usually academics working in the purview of Office of Science, and citations are how they get promoted.
I don't think anyone worries about AI training on their codebases, since LLM providers are not held accountable to any copyright enforcement anyway.
(Not copyright claims against commercial LLM companies).
I'm not aware of many contracts for bespoke software in the state government space; it's far more frequent that someone identifies a need and then develops a solution to bring to market.
If you, at home, pay someone to do work, what exactly do you own beyond the end product?
Obviously you keep personnel records private (unless there's some law/court case requiring it be open). Classified material is already classified, and is kept private regardless - but there's a good argument to be made that there ought to be automatic declassification of material after a set amount of time (perhaps 20-30 years). Declassifying material is good for the public, as the ability to audit the past prevents future abuses.
Not the parent, but I'm pretty sure that their intent is that it's the default that should be flipped. At the moment, all agencies default to confidential, and only share their work in particular cases; the proposed change would be of making all the work transparent unless explicitly classified.
As a good example of how this approach is implemented, see GitLab, which shares pretty much everything except personal data of its employees and customers.
Or there are other software secrets that we wouldn’t want state adversaries to see, like things that block your access under export control laws?
> The new law doesn’t apply to classified code, national security systems or code that would post privacy risks if shared.
Now, imagine if that exploit was instead intentionally planted by a foreign spy, targeting common use cases...
Just saying it (passing a law) will not make it happen, it will require a lot of work on everyone's part. I wish them luck.
If agencies share code, there will be a need for coordination of changes, someone makes extra money in the process? i.e. a small contractor doesn't get the change request because only some big player can handle the complexity of a mutli-agency contract?
The government is full of cronies, this bill was passed because someone has plans to use it to screw us all over.
I speculated this bill may feel good, but like all things in the government, was only passed because wealthy cronies have determined it is a net good for them in the long run.
What connection am I missing?