I wrote a comment on a blog post at Government Information Security Blogs - Cybersecurity Vs. Information Security - which is part of the site GovInfoSecurity. The comment isn't published yet, but it should be interesting for some readers of SEOTaktik, and I also want to archive it here.
One general note to add to the comment. We all know of the problems there have been, and the constant lesser problems of today. I strongly believe that if you have had such problems for a long time, it is dangerous to limit the scope to methods and technology that have been in use for ten years or more and to see the problem only as incorrect use of them. That is certainly part of the solution, but at any reasonable cost we will never see a change in the right direction before entities learn to stay ahead - and that means understanding security not from a few limited perspectives but from more of the field, so that other solutions and potential problems are discovered earlier.
Topics covered in the comment:
- 0. Additional NIST resources
- 1. Why IT and information security changed to cybersecurity
- 2. In the future we must expect more knowledge from security experts: entropy, graph theory, modeling, pattern recognition, statistics and advanced risk management (not the few trivial formulas)
- 3. NIST has done a very good job, but problems might exist and in the following year they should track them down
- 4. The power grid is more important than anything else - it is the heart of America
- 5. I do not feel the government IDS is mature enough to be deployed live
- 5.1. Potential problem I, which probably isn't handled
- 5.2. Potential problem II, which I think isn't handled
- 5.3. Potential problem III, which I think isn't handled
- 5.4. The problem of analyzing complex data
- 6. Additional resources regarding efficient network architecture and correlated attack pattern recognition
- 7. Additional resources regarding theory for entropy, pattern recognition and statistics
Additional NIST resources
The NIST Smart Grid Interoperability Standards Project creates a much-needed new standard for power grid security.
It is called the smart grid because of the interoperability between different provider entities and consumers, which - besides previous security breaches - is one reason the security part of the standard is extremely important. The project page also gives access to the current version of the standard.
The following press release on the project's status should be read: NIST Finalizes Initial Set of Smart Grid Cyber Security Guidelines (September 2, 2010).
Comment made on Government Information Security Blogs
"Thanks for a very readable net newspaper. It is one of the few I read from time to time. I especially like the blogs that point out interesting happenings in short summaries. You are also often good writers, which is nice.
I made a comment which became rather long, but sometimes things go that way. My intention was just to discuss the history of the three terms. Along the way I also pointed out a possible security problem (or a set of them) in the big IDS solution the feds have rolled out and soon will roll out even more of.
1. Why IT and information security changed to cybersecurity
I think the situation was that, further back, we spoke generally about IT and information security. The problem was that this created a distance, where some people worked with one or the other.
People who worked with information security often lacked practical understanding, which made their work incomplete (and led to buying the wrong solutions), while people working in IT often saw no further than the practical functions of a single application, server or firewall.
The result was - and this is still normal (as we could, for example, read in the news regarding US-CERT, which advises the US on security while still failing to keep its own IT environment secure enough; and since US-CERT has access to Einstein, that risks a back channel into very sensitive data from more or less all of the US government, and more than that) - that practical work and policy do not match up, and also, in general, that quite cost-inefficient solutions get used.
By using the term cybersecurity, one underlines that an understanding of both these areas of security is the minimum required to do good work. It doesn't mean you need to be a practical expert on everything, but you have to understand both parts well enough not to make mistakes simply because you didn't learn enough.
A good example of both combined in a "policy" would be the NIST standard for the power grid.
2. In the future we must expect more knowledge from security experts: entropy, graph theory, modeling, pattern recognition, statistics and advanced risk management (not the few trivial formulas)
In the longer term, once the concept of cybersecurity is working (which will take years), I would like one more perspective to become just as general: a basic understanding of information - redundancy, entropy, graph theory, pattern recognition, and risk management that goes beyond a few simple formulas, including statistical analysis and methods for modeling, and so on.
Even a tiny bit of basic knowledge there would often make it obvious that several seldom-used network architectures for security, no matter how good they look, will cause problems.
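To give one concrete piece of the basic knowledge I mean, here is a minimal sketch of Shannon entropy over a byte string - the kind of measure that separates repetitive, structured traffic from random-looking (compressed or encrypted) payloads. The function name and the example data are mine, for illustration only:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    # H = -sum p_i * log2(p_i) over the observed byte frequencies
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

# Repetitive data scores near 0; data where all 256 byte values are
# equally likely scores the maximum of 8 bits per byte.
low = shannon_entropy(b"AAAAAAAAAAAAAAAA")
high = shannon_entropy(bytes(range(256)))
```

A monitoring system that understands this can, for instance, flag a supposedly plain-text channel that suddenly starts carrying near-maximum-entropy payloads.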
Regarding the NIST standard for the power grid, I can see it is a good example of handling both classical IT security and information security, and of doing so with quality (hence something others can learn a lot from, though I remain skeptical about details), both regarding people and architecture.
3. NIST has done a very good job, but problems might exist and in the following year they should track them down
But I miss any and all of this kind of thinking in the standard. Given that it is a grid with a lot of entities, such analysis - covering the physical power, the security solutions, the information passing that affects things, and so on - at a general scope is what I feel they should do, even if it might not be a natural part of the standard itself.
Otherwise you can view things in a way that, to your surprise, makes you miss several very dangerous potential problems, and for the power grid that isn't acceptable.
4. The power grid is more important than anything else - it is the heart of America
A lot of other things are easier to handle, but the power grid must be top security. The policy - though it is a great change compared to not long ago, when power companies actually were infiltrated without knowing it - should not be viewed as something finished that only has to be implemented. Instead, I feel more work, both practical and theoretical, has to be done to make sure it won't fail against a very skilled attacker with resources, for example one associated with political turbulence involving a foreign country such as China.
The power grid must be kept secure. Other projects and priorities are of no importance compared to it. The power grid is what can be used to cause a war, what can be damaged as part of a war, and what can be used to simply cause damage outside big politics. Never again must the power grid be allowed to be infiltrated.
5. I do not feel the government IDS is mature enough to be deployed live
Besides the power grid, the only area I feel is critical (among those I am not sure are handled well enough) is this Einstein solution (or whatever they finally call it) - the big IDS solution. The problem is that it connects too much - more or less everything, if you consider that reaching one node might let you move outward from it and do damage.
5.1. Potential problem I, which probably isn't handled
Also, they put advanced security solutions into it, for example deep packet inspection and other very complex solutions, because that sounds good (though I still lack statistics proving it is better). The problem with deep packet inspection and other such complex solutions is that they reuse very old C code written for very basic protocols long ago. I have programmed directly with several of these over a few years, first at Ericsson implementing RFC 2704 and RFC 2035 in C - that is, GSS-API and KeyNote trust management (besides any number of other things). I have also later built CA servers using the same older code we all used to build these things.
Such software is, to start with, extremely complex. For each new protocol that got an implementation, the complexity increased even more, and implementations often carried the original code along with them, thereby breaking the path for updates, since the code sits far below everything else, forgotten. Hence I believe that, for example, an ASN.1 implementation once written in the early 1990s to work with X.509 certificates as standardized at that time (perhaps with a few optimizations outside the standard) now sits in just about everything, without anyone knowing, because it got put into other software by cutting out the code rather than installing the package.
Now, I certainly don't know how many ASN.1 implementations in C exist, but there aren't many at all. Some, I think, reuse this one, and it might very likely have problems (that is my memory from 2009, when we rewrote it, adding a special memory-handling unit to stop any overflows and the like). That ASN.1 code is itself built on an even more complex library which, when I used it from time to time between 1999 and 2005, had an immense number of potential security problems.
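To show concretely where such code tends to go wrong, here is a minimal sketch of DER tag-length-value parsing - in Python so the logic is visible, though the historical bugs were in C. The function is my own toy, not from any real library; the point is the long-form length field and the bounds check that lazy implementations skip:

```python
def parse_der_tlv(buf: bytes):
    """Toy DER tag-length-value parser (illustration only, not a real library).
    Returns (tag, value) or raises ValueError on malformed input."""
    if len(buf) < 2:
        raise ValueError("truncated header")
    tag = buf[0]
    first = buf[1]
    if first < 0x80:                  # short form: length fits in one octet
        length, offset = first, 2
    else:                             # long form: low 7 bits = count of length octets
        n = first & 0x7F
        if len(buf) < 2 + n:
            raise ValueError("truncated length octets")
        length = int.from_bytes(buf[2:2 + n], "big")
        offset = 2 + n
    # The defensive check naive code omits: never trust the wire-declared
    # length until it has been compared against what the buffer actually holds.
    if offset + length > len(buf):
        raise ValueError("declared length exceeds buffer")
    return tag, buf[offset:offset + length]
```

A C implementation that allocates or copies `length` bytes before that final comparison is exactly the kind of over-read or overflow I am describing.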
That wouldn't be a problem if people weren't often lazy, but they are, so we get these software tools that claim they can detect memory problems. Here, though, we are in the territory of complex protocols - and not one protocol but several - where the graph expressing the different paths the negotiation, and the actions after it, can take is very complex. Also, the protocols are layered on each other, several using different parts of this problematic C code without anyone remembering where it came from. A testing tool can hit such code with random data, too-big data, stress tests and so on, and find things to correct - or find nothing - but it doesn't make any real difference.
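A toy illustration of why random testing finds so little in stateful protocol code. The protocol, message names and vocabulary size here are all invented; the point is only that a bug hiding behind one specific negotiation sequence is nearly invisible to random input, yet trivial for an attacker who has mapped the state machine:

```python
import random

class ToyHandshake:
    """Hypothetical three-step protocol handler (invented, not a real protocol).
    A latent bug is reachable only via the exact sequence HELLO, AUTH, REKEY."""
    def __init__(self):
        self.state = "START"

    def feed(self, msg: str) -> str:
        if self.state == "START" and msg == "HELLO":
            self.state = "GREETED"
        elif self.state == "GREETED" and msg == "AUTH":
            self.state = "AUTHED"
        elif self.state == "AUTHED" and msg == "REKEY":
            self.state = "BROKEN"   # undefined state: the security hole
        else:
            self.state = "START"    # anything else resets the negotiation
        return self.state

def random_fuzz(trials: int, seed: int = 1) -> int:
    """Feed random three-message sequences; count how often BROKEN is reached.
    With 20 possible messages the exact sequence has probability (1/20)**3."""
    rng = random.Random(seed)
    vocab = ["HELLO", "AUTH", "REKEY"] + [f"MSG{i}" for i in range(17)]
    hits = 0
    for _ in range(trials):
        h = ToyHandshake()
        for _ in range(3):
            if h.feed(rng.choice(vocab)) == "BROKEN":
                hits += 1
                break
    return hits
```

Real deployments make this far worse: the state space is thousands of states deep across layered protocols, so the random fuzzer's hit probability collapses while the attacker with source code still walks straight to the target state.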
If someone wants to attack, he gets the source code, builds the process trees, expresses the entropy and weight relationships, steps through interesting code paths by hand, and so on, until he has mapped out what happens - including for other entities he does not control directly but may be able to interact with. From that he identifies a state that is either undefined, giving him possibilities, or one where several unlikely things happening in series (which might take weeks to reach) creates a security hole that can then be used.
Though I doubt it will take anything less than several years. The current situation is, after all, more or less what it has been for years: simple solutions get repeated year after year, seldom solving much, either because of their limits or because no one deploys them across a wide enough population (humans, servers, or whatever).
If one is unlucky, that would include a program being transferred into, for example, the listening unit or the firewall - executing perhaps not to knock that firewall out, but to spread one or several other programs into the simple, efficient network Einstein uses, reach parts where extremely sensitive information is passed, collect it there, and return it for use in a targeted attack, for example making a program act as a high-up official - perhaps even Janet Napolitano - or as a technician handling the energy grid.
And this is the energy grid - one of those things where incompetence and mistakes aren't allowed, because the risk isn't acceptable. I certainly noticed that several RFC standards got updated quite fast this year, but not the ones I felt were more likely to be laterally problematic (that is, problematic together with another but not necessarily by themselves) or just ordinarily problematic - or, in the case of ASN.1, both.
I say NIST should see to it that the good work they have done gets finished properly and with quality, while staying heavily, even rudely, paranoid about any and all solutions the product companies come wanting to sell them, whether it is deep packet inspection or anything else that includes very complex parts.
You ask to see all the software, you make a tree of it, and you do not stop at the product but track it back to where it came from. If NIST does that, NIST will find complex security holes - for sure more complex than anyone exploits today, but this is the power grid we are talking about, so they shouldn't be there at all. No doubt. I know. Without programming on the project itself, I might be the person in the world with the longest experience of that code (though it was a few years ago).
In principle we have more or less the same problems and the same security principles as in 2000, and they still fail as often. The only difference is that we have more different solutions, more users and more attackers.
5.2. Potential problem II, which I think isn't handled
This is rather a group of problems. The previous one is quite advanced for anyone to find a path to misuse, though I would be surprised if it isn't possible; there I would regard entities with resources as the bigger threat.
The same could partly be argued about this set of problems, but I also strongly feel some of them might be easier. Lacking an environment where I can set it up, and lacking time, I doubt I will check it in practice (and it might actually require things I cannot get for free). But this type of problem, just like 5.1, is one the industry does not discuss at all - probably because very few people have enough knowledge across the right number of areas, including protocol implementation, cryptanalysis, and writing fast C code with the problems that can create. The current culture is to buy units with functions one believes are OK and to patch, while in these areas (both this one and 5.1) we are in territory the product companies themselves lack knowledge of.
It is harder for me to express 5.2 here. Partly because I feel it is less suitable to publish. Also because it would actually take a week or two to dig out old papers and set up an analysis environment to express relationships, weighted graphs and the like, probably together with CCS for parts of the inter-protocol communication (if practically possible).
5.3. Potential problem III, which I think isn't handled
A third area, easier to express, where security problems may have been missed concerns the implementation and use of algorithms such as:
1. Software implementing trivial Gaussians. Does it make undocumented assumptions about the data? Well, the ones I have tried - not many, but almost five - do. Is that handled with defensive programming? I doubt it.
2. Maximum Likelihood (ML) estimation and similar algorithms. Can we get problems here from the input data? And if not, can a problem be expressed through the likelihood itself, causing trouble in another, less well-written component? Has that been checked? Or have people just fed it random garbage data, which obviously is no good here?
3. Do you use Parzen windows? Who knows. Depending on how you do it and what it interacts with, problems can arise.
4. Can your NNR (nearest-neighbour rule), if used, be fooled by other input?
I mean, classification is the main thing in an IDS. The more complex the algorithms you use, the better they will do their work on complex data - but also the more sensitive they will be to special data, perhaps combined with other security problems, crafted to force a wrong classification.
This is not that hard, you might think. It is just that the area is new in security, so people treat these algorithms as black boxes that do stuff.
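As a toy illustration of point 4 and the black-box danger, consider a 1-nearest-neighbour classifier over two invented traffic features. All the numbers and feature names below are made up for illustration. Because the features are used raw - an undocumented assumption that they share a scale - the byte-count axis dominates the distance, and an attacker who keeps packet sizes benign-looking gets classified benign even at an obviously hostile connection rate:

```python
import math

# Invented training points: (bytes_per_packet, connections_per_second).
BENIGN = [(200.0, 2.0), (220.0, 3.0), (180.0, 2.5)]
ATTACK = [(900.0, 40.0), (950.0, 35.0)]

def nearest_label(x: float, y: float) -> str:
    """1-nearest-neighbour rule on raw, un-normalized features."""
    best_d, best_lab = float("inf"), ""
    for points, lab in ((BENIGN, "benign"), (ATTACK, "attack")):
        for px, py in points:
            d = math.hypot(x - px, y - py)
            if d < best_d:
                best_d, best_lab = d, lab
    return best_lab

# An obvious attack profile is caught:
#   nearest_label(920.0, 38.0) -> "attack"
# But keep the packet size benign-looking and the un-scaled byte axis
# dominates the distance, so 40 connections/second still comes back benign:
#   nearest_label(210.0, 40.0) -> "benign"
```

Feature normalization fixes this particular toy, but the general point stands: every such hidden assumption in the pipeline is a knob an informed attacker can turn.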
And do you use syntactic recognition via parsing? You know there is a reason why your compiler complains time after time when you program: it is hard to write that type of language. Hence it is generated automatically for these kinds of things, but that doesn't mean security holes don't come with it, and even if the generated parsers are checked by hand - which I guess the NSA did for you - we also have the security problems of the language itself. That is: does it catch things, or are there special expressions in the data traffic that cause odd states? This problem I believe is the least likely in this part, since I guess the NSA may have checked it. The others I am less sure of.
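A small, self-contained example of "special expressions causing odd states": a naive backtracking matcher for the ambiguous pattern (a|aa)*b. The matcher is a toy written for illustration, not the output of any real parser generator. On non-matching input of n a's the step count grows roughly like the Fibonacci numbers, which is exactly how crafted traffic can stall a syntactic recognizer:

```python
def backtrack_match(text: str):
    """Naive backtracking matcher for the pattern (a|aa)*b.
    Returns (matched, steps) so the exponential blow-up is visible."""
    steps = 0

    def match(i: int) -> bool:
        nonlocal steps
        steps += 1
        # accept: a final 'b' right here
        if i < len(text) and text[i] == "b" and i == len(text) - 1:
            return True
        # ambiguous alternation: try consuming one 'a', then two 'a's
        if i < len(text) and text[i] == "a" and match(i + 1):
            return True
        if text[i:i + 2] == "aa" and match(i + 2):
            return True
        return False

    return match(0), steps

# "aab" matches quickly, but "a" * n (no final b) forces the matcher to
# explore every split of the a-run: steps follow f(k) = 1 + f(k-1) + f(k-2).
```

Production regex engines have the same failure mode (catastrophic backtracking), which is why a hand check of the grammar - not just the generator - matters.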
5.4. The problem of analyzing complex data
Given that in some places this will be used, it isn't possible to let, for example, video and photos pass without checks. I guess you would generally buy something to check such content, which of course might make it harder to ensure it isn't a wide-open door into the full Einstein network.
Me, I say: if parts of it need to pass movie clips, you more or less isolate those networks and keep away the complex security solutions that can cause more problems than they solve. Here I am partly thinking of cases where they get confused by the data and waste time, so that important images do not reach their target fast enough.
Best regards & good luck with your nice internet magazine - one of the best.
Hans Husman
Previous work and activity:
Engineering Physics, Sweden
Ericsson
The security magazine, International Data Group, Sweden
Security consulting mostly regarding security testing in telecom and for product companies.
Work now:
Owner of the journal Nyfikenvital.org
Working on a model for human language and creativity
Making a children's version of GNU Octave so more kids see how fun math is.
6. Additional resources regarding efficient network architecture and correlated attack pattern recognition
This article is in Swedish but the research papers and the presentation of them are written in English:
En enkel "topologisk" modell för att uttrycka dynamik i nätverk där meddelanden passeras (A simple "topological" model for expressing dynamics in networks where messages are passed)
The solution described there for modeling the dynamics of messages in networks, where the meaning of the information (disregarding routing and other path data) has no effect, is extremely interesting.
First because such models (not done exactly the same way, of course) can be used to audit security solutions and other infrastructure, and to show an energy-efficient architecture, where other algorithms from graph theory and compression can also have value (and may be a better choice depending on size and type of use). If the network is not efficient, and the extra parts are not well-defined redundancy, the network is more insecure in risk-probability terms: both because errors are more likely to have been introduced, and because of the rate at which unknown vulnerabilities show themselves.
It is also less secure because, if the expression of the network isn't very efficient, advanced traffic analysis won't be possible; instead you are stuck with trivial things such as checking for bad packets and running small, slow correlation controls, which themselves easily become a target until they get turned off. A network built efficiently lacks that problem, because it is built more efficiently than the entities sending data into it.
A good first introduction to this much bigger area is Learning Gaussian Tree Models: Analysis of Error Exponents and Extremal Structures (PDF) from IEEE Transactions on Signal Processing, for which the following MIT press release also holds value: Sizing samples - Many scientific disciplines use computers to infer patterns in data. But how much data is enough to ensure that the inferences are right?
Those who read Swedish might try a puzzle I bought earlier this year which illustrates the same principle (though I didn't prove it mathematically as Vincent Tan did, which is much harder than showing it visually).
Regarding theory for entropy and compression, there is a very good article at Technology Review: Code Quest - Claude Shannon, SM '40, PhD '40, threw down an irresistible challenge to those who would be pioneers of information theory. A young grad student soon met the challenge, but his solution languished in obscurity for decades. (MIT owns it) - to which I also posted a comment. It gives an introduction suitable for someone new to the terminology of entropy and efficient compression, as well as to two important researchers: Shannon and Gallager.
7. Additional resources regarding theory for entropy, pattern recognition and statistics
Readers who speak Swedish, or who use Google Translate, can find some resources on the area discussed in 2. In the future we must expect more knowledge from security experts: entropy, graph theory, modeling, pattern recognition, statistics and advanced risk management (not the few trivial formulas).
Informationsanalys för att beräkna kreativitet och modellera hjärnan (Information analysis for computing creativity and modeling the brain)
A short introduction to entropy in the information-theoretic sense.
Att bedöma ekonomisk utveckling: Några råd (Assessing economic development: some advice)
Exactly as in security, it is the raiders, a few people whom few listen to, and some researchers unaware of the world outside their rooms who consider several of the problems we spoke about here. In economics it is the fast traders who use this, while even entities like the Riksbank lack anything modern for judging things from this perspective.
Medicinsk statistik och statistiska modeller (Medical statistics and statistical models)
Recommends three good books on pattern recognition, statistics and error handling, of which the first two are of interest here - especially the first.