Take advantage of our Call for Papers! We can’t believe that all the devices, networks, services, and shiny things around us are completely secure. Once something gets Wi-Fi, a SIM card, memory, or a processor, there is bound to be an accident. It’s not just hunting rifles, jeeps, currencies, experts, and airplanes that can be hacked. There is more. Tell us!
Don’t let the IT crowd of today repeat the mistakes of our ancestors. Submit a two-day training and help to save some souls! We are especially interested in secure application development, intrusion detection/prevention, penetration testing, crypto & secure communication, mobile devices, the Internet of Things, security intelligence, wireless hacking (Wi-Fi, mobile networks, …), forensics, and your workshop that really knocks the socks off our attendees! Drop your training submission into our CfP manager!
P.S.: Don’t send drones, just text. Thank you!
What is your first impulse when you see a fence? Well, we can’t speak for you, but we like to look for weak spots, holes, and ways to climb them. The same is true for filters of all kinds. Let’s see what one can do to bypass them. Anti-virus software is a good example. At DeepSec 2014 Daniel Sauder explained how malware filters/detectors fail. Daniel was kind enough to provide an article for the special edition „In Depth Security – Proceedings of the DeepSec Conferences“:
„Based on my work on antivirus evasion techniques, I started using these techniques to test the effectiveness of antivirus engines. I researched the internal functionality of antivirus products, especially the implementation of heuristics by sandboxing and emulation, and succeeded in evading these. One result of my research is a set of tests that check whether a shellcode runs within an x86 emulation engine. One test works by encrypting the payload, which would normally be recognized as malicious. When the payload is recognized by the antivirus software anyway, chances are high that x86 emulation was performed.“
Thanks to Daniel for the effort and the research.
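To illustrate the idea behind that test, here is a minimal sketch of ours (not code from Daniel’s paper): a payload is stored XOR-encrypted, so a static signature match fails; if a scanner flags the sample anyway, it very likely executed the decryption stub in its emulator. The key value and the use of the harmless EICAR test string are our own choices for the example.

```python
# Sketch of the emulation test described above (our illustration, not
# Daniel's original code). A payload is stored XOR-encrypted; a static
# signature scan cannot match it. If a scanner flags the sample anyway,
# it most likely ran the decryption stub inside its x86 emulator.
KEY = 0x5A  # arbitrary single-byte key chosen for the example

def xor_crypt(data: bytes, key: int = KEY) -> bytes:
    """XOR is its own inverse: the same call encrypts and decrypts."""
    return bytes(b ^ key for b in data)

# The harmless EICAR test string stands in for a real payload.
payload = b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*"
encrypted = xor_crypt(payload)

assert encrypted != payload              # signature hidden from static scans
assert xor_crypt(encrypted) == payload   # the "stub" recovers it at run time
```

A real evasion test would of course use an actual decryption stub in native code; the point here is only the logic of the experiment.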
What do NoSQL databases and cars have in common? You can find and freely access them by using the trusty Internet. Wired magazine has published a story about a remotely controlled Jeep Cherokee. Charlie Miller and Chris Valasek have found a way to use the properties of UConnect™ combined with (design) flaws to take full control of the vehicle. The threat is real since the car was attacked remotely by using a network connection. UConnect™ was formerly known as MyGIG™, and systems have been available since 2007. It’s basically your entertainment system on steroids with added telemetry, internal commands, and network capabilities. Hacking cars by attacking the entertainment system was already discussed at DeepSec 2011. This is the next level, because cars now have their own IP addresses (and no firewall apparently).
NoSQL databases are very much next-generation technology (opinions are divided, we know). They abolish the rigid corset of relational databases, and give you more freedom. Incidentally they also do away with the pesky authentication most database systems require. John Matherly took a hard look at MongoDB. He used the Shodan search engine, and he found thousands of MongoDB installations exposed online. A closer look revealed a total of 595.2 TB of data exposed on the Internet via publicly accessible MongoDB instances. Missing secure defaults might be an explanation, but these databases were probably put online without a review of the configuration or putting security measures in place. Welcome to the Internet of Stuff!
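If you want to check your own perimeter, a trivial first probe needs nothing beyond the standard library. The sketch below is our example, not Matherly’s method; it merely tests whether MongoDB’s default port accepts connections at all. A real audit would go further and check whether the instance answers commands without authentication.

```python
import socket

def mongodb_port_open(host: str, port: int = 27017, timeout: float = 2.0) -> bool:
    """Return True if the host accepts TCP connections on the (default)
    MongoDB port -- a first hint that an instance may be exposed."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe the local machine.
print(mongodb_port_open("127.0.0.1"))
```

Run this against your own address ranges only; probing other people’s systems may be illegal where you live.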
Now what? One can argue about security design, patching systems, improving all kinds of components, awareness, and a lot more. This won’t stop devices from getting connected to networks and being exposed to third parties (let’s phrase it neutrally). Bear in mind that the UConnect™ demonstration with the Jeep Cherokee is more than just pinging IP addresses and sending commands. It requires knowledge of the protocols involved and of the flaws that can be exploited, and it is not readily available to everyone – yet. This will change.
The MongoDB issue is wildly different. The common ground is the repetition of history. The mindset of the designers (or the people implementing the system) seems to be stuck in the 1990s. Once you introduce networking for your appliance, telephone, car, or whatever you like to see interconnected, you have to address the consequences. The easiest way to do this is to get people on board in order to break things. If your shiny new toy gets broken in the lab, you have an advantage. Car companies do crash tests ad nauseam. You can crash test software and protocols, too. Information security experts do this all year. There are no excuses for blindly exposing systems to networks, really.
If you have ideas on how to make vendors listen and actually implement security as a process, let us know by submitting a talk or even a workshop. The Call for Papers for DeepSec 2015 is still open!
A sysadmin, a software developer, and an infosec researcher almost walked into a bar. Unfortunately they couldn’t agree where to go together. So they died of thirst.
Sounds familiar? When it comes to information technology, there is one thing that binds us all together: software. This article was written and published by software. You can read it by using (different) software. This doesn’t automagically create stalwart bands of adventurers fighting dragons (i.e. code vulnerabilities) and doing good deeds (i.e. not selling 0days). However it is a common ground where one can meet. Since all software has bugs, and we all use software, there’s also a common cause. Unfortunately this is where things go wrong.
Code has a life cycle. It usually starts out as a (reasonably) good idea. Without a Big Bang. Then the implementation process begins with a proof of concept (in the case of destructive software) or a prototype (which sounds a lot more friendly but may be even more cruel). Everything after this stage is your typical Software Development Cycle™. Depending on the culture of the company or organisation your code base either evolves by improvement or degenerates into a mess of patches, fixes, workarounds, and corporate Bullshit Bingo™ a.k.a. „structured quality assurance“. Success is another factor. Once the software gets widely used, there is a tendency to not change it as much as might be necessary. YMMV, of course.
Things get really interesting when the security researchers find critical bugs. More often than not the application breaks at the seams between components, or can be attacked by the dreaded „legacy code“ that has lived in some function since the first prototype or early release. Now you can add your daily dose of hindsight, usually taken from a „Secure Coding“ workshop, and point out that all of this could have been avoided by proper maintenance. This is just another word for code refactoring. It is an essential part of software development. So much for the theory. Once you take a good look at the root cause of many CVE® entries, you cannot see refactoring in action. Of course this is a drastically simplified view of security-relevant bugs in code. There is a very prominent example in the form of the widely deployed OpenSSL library. The Heartbleed bug led to a refactoring revolution called LibreSSL. The OpenSSL team did the same and documented the goals in their roadmap.
Refactoring code is not easy. It is more than just fixing typos or using Yet Another Elegant Programming Language Feature™. It requires experienced developers who know what the code does, and who are very familiar with the tasks the code doesn’t do well. Every single executable comes with a free dose of hacks nobody knows about. Finding these workarounds is an effort well spent. Fixing small issues might prevent big issues from gaining momentum.
Since the Call for Papers for DeepSec 2015 is still open, we are interested in hearing your code refactoring stories and their impact on security (i.e. we want you to break some things, but end with the princess and the prince getting together ☻). Looking forward to seeing your presentation on stage!
P.S.: Don’t pick on sysadmins, developers, infosec people, or anyone involved in software testing. We’re all mad here together, you know.
Anti-virus software developers made the news recently. The Intercept published an article describing which vendors were targeted and what information might be useful for attackers. Obtaining data, no matter how, has had its place in the news since 2013, when the NSA documents went public. The current case is no surprise. This statement is not meant to downplay the severity of the issue. While technically there is no direct attack to speak of (yet), the news item shows how security measures will be reconnoitred by third parties. Why call it third parties? Because a lot of people dig into the operation of anti-virus protection software. The past two DeepSec conferences featured talks called „Why Antivirus Software fails“ and „Easy Ways To Bypass Anti-Virus Systems“. The Project Zero team at Google found a vulnerability in an anti-virus product enabling attackers to attack running systems. So reverse engineering is done all the time, not only by government bodies.
Frankly most software really invites reverse engineers by communicating without encryption or authentication with their (update) servers. Some protection software is just a botnet with corporate C&C servers, waiting for their commands to be extracted by curious researchers. You can even gather information about systems in a local network once you can take a peek at the network traffic and isolate the chatter from the anti-virus and intrusion detection software in place.
However, infosec researchers usually do not intercept emails from vendors or their developers. The chances of learning about a new 0day (or getting an indication that one’s own 0day has been found) are considerably higher when looking at anti-virus or intrusion detection companies. Samples of new malicious software are often sent to anti-virus companies for further analysis. Plus you get a warning as soon as one of the AV engines is vulnerable to a new exploit. It’s a kind of weather forecast, either legal or illegal, depending on the applicable laws. And it is a good example of valuable information well worth copying.
In turn this means that your network and your systems are part of this game. There is no virus or intrusion protection without a steady flow of updates on new threats. Given the fact that you most probably use a 24/7 Internet connection with your computing equipment, the door is virtually open. For reconnaissance, that is. Everything else depends on your additional defences and the willingness of the software vendors to use strong encryption, authentication, and secure code. Have a look into that. Your time isn’t wasted, and you might get ideas on how to improve your defensive skills.
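One place to start looking is the transport of the update channel itself. The following sketch (our illustration; pick whatever endpoint your vendor actually uses) attempts a TLS handshake with certificate and hostname verification enabled, which is the bare minimum an update server should survive:

```python
import socket
import ssl

def tls_handshake_ok(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Attempt a TLS handshake with certificate and hostname verification
    enabled. A server that fails this check either speaks no TLS at all
    or presents a certificate that cannot be validated."""
    ctx = ssl.create_default_context()  # enables CERT_REQUIRED + hostname check
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (OSError, ssl.SSLError):
        return False
```

If the update traffic runs over a custom port or protocol, a packet capture will tell you quickly whether it is plaintext; the point is the same either way.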
We have some more translated news for you. In theory it is an article about policies and the process of law-making. In practice it concerns the use of encryption and everyone relying on service providers (mostly connected to the Internet, i.e. „cloud providers“). No matter how cool your start-up is and what its products aim to replace, information security will probably need a backdoor-free and working encryption technology as a core component. This is exactly why you cannot stay focused on the technology alone. Threats may come in the guise of new laws or regulations (think Wassenaar Arrangement). Matthias Monroy has some information about the official stance of the German government regarding the currently raging „crypto wars“. Enjoy!
Author: Matthias Monroy
Published: 17.06.2015, 17:09
The [German] Ministry of the Interior (BMI) took a clear stand in its reply to a brief parliamentary inquiry on the current crypto debate. The inquiry was triggered, among others, by statements of the heads of Europol and Interpol, who warn of an increasing use of encryption technologies. According to the head of Europol these are ”one of the main instruments of terrorists and criminals”. Previously, the EU anti-terrorism coordinator Gilles de Kerchove had demanded to force Internet and telecommunications providers to install backdoors for encrypted communication. Now the EU Commission is preparing a series of meetings with Internet service providers that shall lead to the creation of a joint forum, to ”provide room” for the ”concerns of the law enforcement agencies” regarding the new encryption techniques.
The brief parliamentary inquiry focused on the possibilities (of federal authorities) to ”circumvent, lever out or disable encryption technologies”. In his reply, state secretary Günther Krings referred to the ”Cornerstones of German encryption policy”, a federal statement released in 1999. It was adopted under former German Chancellor Gerhard Schröder’s SPD-Green cabinet and was meant as a contribution to the then ongoing crypto debate, a time when the demand for the installation of backdoors was very fashionable.
Furthermore, the cabinet decision “Cornerstones of German encryption policy” from 1999 still stands. The digital agenda of the federal government includes the goal to make Germany the “no. 1 encryption location”. The development and consistent use of trustworthy IT security technologies are crucial for businesses, government and citizens in today’s information society. Therefore, the specific weakening or regulation of encryption technology is not pursued by the federal government.
The by now 16-year-old document states that the government ”does not intend to restrict the free availability of encryption products in Germany”. Instead it strives to ”strengthen the users’ confidence in the security of encryption”. Nonetheless, it says by way of restriction:
By spreading strong encryption methods, the statutory powers of the law enforcement and security agencies to monitor telecommunication may not be undermined.
In the current response the access of police, customs and intelligence services to monitored telecommunication is beyond question as well – however, without mentioning tools like Trojan [malicious] software. But ”authorized bodies” have the right to know about communication content, and in this regard encrypted communication will be treated no differently than unencrypted. To the extent possible, these ”authorized bodies” may ”decrypt legally intercepted, but user-side encrypted communication within the realms of technical possibility […]”.
In Germany there is no obligation to hand over keys or passwords of user-side encrypted communication. But, according to the BMI, keys found in a search, seizure or by ”demand to surrender” can be used to decrypt intercepted communication. This likely includes searching through computer systems.
Decryption of non-user side encrypted communication however, may be implemented by a request to the provider:
As far as telecommunication providers encrypt communication in transit on their networks, this encryption must be removed by the telecommunication providers before they transfer it to the authorities.
Like Europol, Interpol, the European Commission, and the EU anti-terrorism coordinator the BMI sees ”problems in the identification of offenders” due to the increasing use of encryption technologies.
That is why the Federal Government wants to expand the cooperation of the European Commission with Facebook, Google & Co.:
Therefore, from the perspective of the Federal Government, any dialogue with internet service providers that looks for ways to reconcile the different needs of data protection, danger prevention and prosecution is welcomed.
We all rely on software every day, one way or another. The bytes that form the (computer) code all around us are here to stay. Mobile devices connected to networks and networked computing equipment in general are a major part of our lives now. Fortunately not all systems decide between life and death in case there is a failure. The ongoing discussion about „cyber war“, „cyber terrorism“, „cyber weapons of mass destruction“, and „cyber in general“ has reached critical levels – it has found its way into politics. Recently the Wassenaar Arrangement proposed a regulation on the publication of exploitable (previously unknown) vulnerabilities in software/hardware, the so-called „0days“. The US Department of Commerce proposed to apply export controls for 0days and malicious software. While the ban is only intended for „intrusion software“, it may be applicable to a wide variety of code and also to publications about security vulnerabilities. The criteria for „intrusion software“ according to Wassenaar read as follows:
Software ‘specially designed’ or modified to avoid detection by ‘monitoring tools,’ or to defeat ‘protective countermeasures,’ of a computer or network-capable device, and performing any of the following: (a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or (b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
The definition leaves a lot of room for ambiguity and speculation. The modification of system or user data may happen on a daily basis. You might even accomplish this with an ordinary office suite. Apart from „normal“ software you definitely run into problems once you think about penetration tests. These tests are often part of the security process around a network or a set of connected systems. When it comes to (information) security research you are definitely in trouble. A lot of research involves tools that break stuff. To quote from the README file of the network fuzzing tool ISIC: „ISIC may break shit, melt your network, knock out your firewall, or singe the fur off your cat.“ Definitely sounds intrusive. It may even be designed to defeat said protective countermeasures. However it is neither „cyber munitions“ nor the equivalent of Thor’s Hammer (or any other mythological wonder weapon you prefer).
The proposal harms information security research. A disclosed vulnerability can be dealt with. Disclosing information about bugs in software and hardware is a way to disarm a digital weapon (which is a very bad metaphor!) and to close yet another door for „intrusion software“. In addition, bans or export controls create closed circles within the countries of origin where „dangerous code“ will be discussed and developed. Depending on the angle you look at it from, the proposal of the US Department of Commerce can also be seen as a way to develop domestic 0days and force researchers to keep these attack tools in the country – for whatever reason your imagination can come up with. Setting aside the speculation, the proposal is too broad to be useful at the current stage of technology. Instead it is essentially a ban of all IT security products and tools. It harms defenders more than it deters attackers from using 0days on your infrastructure.
We strongly agree with Halvar Flake and recommend his blog article on the matter. It is aptly titled Why changes to Wassenaar make oppression and surveillance easier, not harder. Nobody dealing with security research needs new policies that make life harder. Comparing code with weapons doesn’t help either. Providing a safe legal basis for security research is the way to go, especially with the proliferation of Internet-connected devices just around the corner.
Modern ways of communication and methods to obtain the transported data have raised eyebrows and interest in the past years. Information security specialists are used to digging into the networked world digitally. Once you take a look at buildings, geographic topology, and photographs of structures, your world view expands. Coupled with the knowledge of ham radio operators, connecting the dots can give you new information about structures hiding in plain sight. This is why we have translated an article by Erich Möchel, an Austrian journalist who writes blog articles for the FM4 radio station.
Read this article for yourself and keep our Call for Papers for DeepSec 2015 in mind. If you have ideas on how to keep an eye on the environment surrounding your information technology infrastructure, let us know. Companies should know their neighbours too, don’t you think?
By Erich Möchel, original source fm4.orf.at. Published 17 May 2015.
The roof structures on the US and British embassies are nodes of a network for mobile surveillance which covers large parts of Vienna.
The four hitherto known monitoring stations on these Viennese roofs are not isolated “listening posts” (this refers to previous articles by the author describing structures on the roofs of buildings in Vienna).
These small sheds sitting on top of embassies and field offices are the four nodes of a network for mobile surveillance which covers large parts of Vienna. This network has been in operation much longer than one might assume. In 2014 the camouflage covering the station on top of the US embassy was completely renovated; two other stations show significant signs of weathering.
As topographical measurements have revealed, there is a very good line of sight between the three known US stations. Broadband radio links are easy to implement with small satellite dishes, while the British station further below can only be reached from the IZD Tower located in Vienna’s 22nd district. This fits in with a key document of the “Special Collection Service” (SCS), a joint venture between the NSA and CIA, outlining a monitoring concept for mobile phone networks in metropolitan areas around the world, describing it as an autonomous radio network with its own data processing on site.
Such a monitoring network collects the key settings of radio cells, their control stations and other important parameters of the three mobile networks of Austria. This data forms the basis for attacks on selected devices, whereby the monitoring of mobile networks on site follows the pattern of fibre optic monitoring. In both cases, the captured metadata gets continually compared with lists of “selectors” or “identifiers”, in mobile networks mainly the IMSI and IMEI numbers of the SIM card or the mobile device.
The British station at Jauresgasse 13 (1030 Vienna) and the station on the IZD Tower show clearly visible signs of weathering and share the same characteristics as the US stations.
As soon as a mobile phone flagged by a selector list logs in to such a monitored radio cell, an alarm goes off automatically, as do the subsequent monitoring steps described in the key SCS document. This automated analysis is mentioned as a core feature of the system. In Vienna this must have been in use since at least 2007.
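Stripped of the radio hardware, the automated matching described in the document boils down to a set-membership test over streamed metadata. A minimal sketch of that logic (all identifiers below are invented example values):

```python
# Minimal sketch of selector matching: metadata from radio cells is
# compared against a watch list of IMSI/IMEI "selectors". The numbers
# below are invented examples, not real identifiers.
SELECTORS = {"262011234567890", "355402091234567"}  # watched IMSI/IMEI values

def alerts(events):
    """Return (identifier, cell) for every login event on the watch list."""
    return [(ident, cell) for ident, cell in events if ident in SELECTORS]

events = [
    ("232017654321098", "cell-0042"),  # not on the list
    ("262011234567890", "cell-0117"),  # flagged -> triggers an alert
]
print(alerts(events))  # [('262011234567890', 'cell-0117')]
```

A set lookup is constant-time, which is why such matching scales effortlessly to the data volumes a “Location Area” produces.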
The slides that describe the SCS system so clearly are from a conference called SIGDEV, but the title slide refers to internal training documents of the NSA from 2007, which were re-worked for the SIGDEV Conference 2011. “Signal Developers” are departments of the intelligence services which are responsible for the development of new data sources. This also explains why the document is relatively easy to interpret: it aims to describe the operation of the system developed by the NSA and GCHQ not to insiders, but to the staff of partner services.
References to the name of this system, however, can be found not in this document but in an undated, most likely later document from the Snowden document pool. An appropriate acronym turned up only in the glossary of a far more extensive document: “Stateroom” seems fairly well chosen, referring to “staterooms” – cabins on cruise ships which offer a panoramic view. In addition, these small houses on top of the embassies of the US and Britain are considered to be part of their own territory, as they are placed exclusively on buildings with diplomatic status. Embassies are not extraterritorial by law, but have a similar de facto status due to diplomatic immunity. …
The system has become sophisticated: from the “maintenance house” to the wide-band antenna systems, its features are now standardized and can be found in capitals around the world. Since they form a network, the individual components must be coordinated; through the use of standardized, programmable antenna systems, the same setup can be used on any cellular network in the world. “SCS Modernization” is the title of one of the slides. Another lists the new stateroom features: “modern IT services and infrastructure to support a network-centric work model, improving safety and maintenance”. The latter was achieved by standardizing the equipment.
Standardization not only simplifies the logistics – the technical support is centralized – but also the training of the operating personnel. In the European headquarters of the SCS in Wiesbaden, where a “Consolidated Intelligence Centre” is currently being built, even a “show house” for training purposes has been set up. By now we even know its exact size: at 20 square meters, it’s slightly larger than initially estimated.
A whole battery of small satellite dishes can be set up in one of these sheds. Primarily the system is not going after individual radio cells but after their control stations. Base Station Controllers (GSM) or Radio Network Controllers (UMTS) in an urban area are typically in charge of more than a hundred radio cells; the whole net segment is called a “Location Area” and has the same function as a “Routing Area” on the Internet.
These control stations are permanently collecting metadata from all radio cells, listing location data, IMSI, last known radio cell, etc. of all phones, located in any cell in their respective “Location Area”. The list updates itself constantly because the terminals are mobile, hence, in motion. This generates significant amounts of data that must be processed.
These data are continuously captured by the stateroom stations and transported through the network to a central station for processing. Since stateroom works close to real time, the data in all probability gets transported via radio links, because the so-called “Einstein/Castanet” antenna system of the SCS is capable of using these as well. Furthermore any radio equipment can be connected, including equipment for “microwave” – in this case not meaning microwave ovens, but radio links around 2 GHz. Since the Einstein/Castanet antenna system is also suitable for transmission, microwave radio links can be built with the same universal antennae between two locations; line of sight is naturally a prerequisite here. Likewise, radio links of mobile operators can be tapped as well.
Most likely the evaluation of the collected data takes place at the NSA villa, because military analysts usually work locally on site. The villa in Pötzleinsdorf is a well-nigh prototypical location for such an evaluation point, whereas the IZD Tower in Vienna’s 22nd district, where the US representation to the United Nations officially resides, is the most likely location for the majority of the technical processing of the data intercepted by the staterooms.
Because some companies moved out, several floors directly below the US mission are vacant. In addition, extensions and conversions are taking place, as several sources report unanimously. Here, too, an upgrade took place: in December 2013, after 12 years of operation, the central air conditioning in the building was replaced with the help of a cargo helicopter. The new cooling system, with a cooling capacity of 4.2 MW, is one third more efficient than the previous model. The need for cooling capacity in the IZD Tower must have risen considerably over the last few years.
Not only the local surveillance networks operate autonomously, the entire system also has its own headquarters near Beltsville, Maryland. At the headquarters of the SCS, topographies of mobile networks from the capitals of the world are systematically collected and regularly updated. This data is not only available to the staff of the NSA and CIA for attack purposes, but also to the intelligence services of the US Armed Forces and all other “agencies” of the US intelligence complex.
Regarding the SCS network in Vienna, the graph above only shows its backbone, and even that incompletely, because at least one station, in good visual contact with the British Embassy as well as with at least one of the other three known nodes, is missing. Like all military networks, this monitoring network acts according to the rules of “network-centric warfare”, the official US military doctrine since the mid-90s.
Once you live in the Cloud, you shouldn’t spend your time daydreaming about information security. Don’t cloud the future of your data. The Magdeburger Journal zur Sicherheitsforschung published a new article by Armin Simma (who talked about this topic at DeepSec 2014). The paper, titled »Trusting Your Cloud Provider: Protecting Private Virtual Machines«, discusses an integrated solution that allows cloud customers to increase their trust in the cloud provider, including cloud insiders.
This article proposes an integrated solution that allows cloud customers to increase their trust into the cloud provider including cloud insiders (e.g. administrators). It is based on Mandatory Access Control and Trusted Computing technologies, namely Measured Boot, Attestation and Sealing. It gives customers strong guarantees about the provider’s host system and binds encrypted virtual machines to the previously attested host.
This article appears in the special edition „In Depth Security – Proceedings of the DeepSec Conferences“. Edited by Stefan Schumacher and René Pfeiffer.
The article is available for download.
Did you feed the cat? Did you lock the door? Did you switch off the Internet while on vacation? Did you wrap your wallet in tin foil? Did you buy that ticket to this conference you want to attend in November? What was it called?
We have a foolproof way to get over this constant feeling that you forgot something. Go to our registration web site, book a ticket to DeepSec 2015, print it out, and write all the important things you have to remember on the back! Your laundry list of All Teh Important Things™ will last until November 2015. After that you will come up with a new way to help you.
Looking forward to seeing you!
Given the ongoing demonisation of cryptography we have translated an article for you, written by Erich Möchel, an ORF journalist. The use of encryption remains an important component of information security, regardless of which version of the Crypto Wars is currently running. While most of the voices in news articles get the threat model wrong, there are still some sane discussions about the beneficial use of technology. The following article was published on the FM4 web site on 25 January 2015. Have a look and decide for yourself if the Crypto Wars have begun again (provided they came to an end at some point in the past). Maybe you work in this field and would like to submit a presentation covering the current state of affairs. Let us know.
The fuss about the call of the EU Council of Ministers for backdoors in encryption software will revive on Tuesday, when Gilles de Kerchove, the EU counter-terrorism coordinator, will elaborate on these highly controversial plans in front of the European Parliament. Explanations are necessary, because until now no concrete proposals, only diffusely formulated desires, have been heard from the Council.
Meanwhile the presentation of the Committee for Technology Assessment at the European Parliament (STOA) on mass surveillance remained largely unnoticed. The STOA Committee strongly recommends safe “end-to-end” encryption – the very opposite of the demands of the Council of Ministers. The Austrian MEP and STOA Chairman Paul Rübig (EPP) explained to ORF.at why safe “end-to-end” encryption is a top priority for EU economy and why the promotion of “open source” software comes a close second.
“First and foremost we have to protect our trade secrets”, explains Rübig. The EU decision makers became aware of this as early as the Euro crisis of 2008/2009, when “hedge funds and other speculators bet on a massive destabilization of the Euro”. “The EU has been ripped off at least twice”, says Rübig. On several occasions it turned out that speculators knew about planned EU countermeasures in advance and were able to thwart them even before they came into force.
“Our guess was that current information from the state chancelleries has been passed on from intelligence to these groups,” says Rübig. For him that’s the main explosive fact about the news reports concerning NSA attacks on the mobile phone of German Chancellor Angela Merkel.
The two-part STOA study is fairly voluminous. It is therefore advisable to start with this four-page briefing. The whole study includes a total of several hundred pages with attachments and recommends no less than the active support of free encryption projects and independent platforms by the EU. This includes support of “security test programs of independent institutions such as the Electronic Frontier Foundation” and the possible creation of EU-own test programs for product safety.
In addition, the funding and support of important “open source” projects like “OpenSSL, TrueCrypt, GPG, or TOR” is recommended, as well as the creation of a European “bounty hunter” project to maintain such essential encryption tools properly and keep them flawless. So, the Committee for the Assessment of Science and Technology of the European Parliament calls for a bonus system for the detection of backdoors in encryption programs, while the Council of Ministers discusses the targeted integration of these very backdoors.
What the national interior and justice ministers actually have in mind remains a mystery, especially as telecom and Internet providers have been obliged since the 1990s to allow access to encrypted data when presented with a search warrant. But of course these obligations only apply to the providers’ own cryptographic applications – they have no access to the “end-to-end” encryption of their customers (such as corporate VPN networks, for example).
Right now there are also no concrete demands on the part of the interior ministry, said ministry spokesman Karl-Heinz Grundböck on request of ORF.at. Grundböck emphasizes that the Interior Ministry by no means demands to make the deposit of second keys a general duty. “We do not want to pre-empt the discussion at EU level and therefore currently cannot go into further detail,” Grundböck said. The questions to him were sent via GPG/PGP-encrypted mail, especially because the Interior Ministry maintains a PGP infrastructure for e-mail communication with citizens.
The stir about a possible obligation to key escrow – which in fact is a kind of encryption ban – refers to the “Crypto Wars” of the 90s, when strong encryption programs like PGP weren’t allowed to be exported. Web browsers such as Netscape or Internet Explorer were only allowed to be distributed with 40-, and then 56-bit encryption keys, a size which already at that time was ridiculously small.
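How small those export-grade key sizes were can be shown with back-of-the-envelope arithmetic. The trial rate below is an illustrative assumption, not a measurement:

```python
# Rough brute-force estimates for the export-grade key sizes mentioned
# above. The rate of a billion key trials per second is an assumption
# chosen purely for illustration.
rate = 10 ** 9  # hypothetical key trials per second

seconds_40 = 2 ** 40 / rate  # roughly 18 minutes
seconds_56 = 2 ** 56 / rate  # roughly 2.3 years
years_128 = 2 ** 128 / rate / (3600 * 24 * 365)  # on the order of 1e22 years

print(round(seconds_40 / 60), "minutes to enumerate a 40-bit keyspace")
print(round(seconds_56 / (3600 * 24 * 365), 1), "years for a 56-bit keyspace")
```

Even a trillion trials per second would only shave three orders of magnitude off the 128-bit figure, which is why the permitted 40- and 56-bit keys amounted to no security at all.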
Instead, NSA and GCHQ tirelessly propagated state “key escrow” databases containing duplicate keys, alternately arguing the need to fight child molesters, terrorists, or hooligans – justifications common in the news articles of the time. Because the permitted weak encryption of browsers made the planned roll-out of Internet banking and e-commerce impossible, the pressure of the banks, the rising Internet industry, and a comparatively tiny but vociferous handful of civil liberty activists increased.
Around the turn of the millennium this unusual alliance managed to sweep away such insecurity measures, despite the fierce opposition of the secret services. Cryptography applications were released first in France, Germany, and Austria, and then all over the EU. Not quite surprisingly, Great Britain was the last to follow suit.
The degree of restraint of the Interior Ministry (BMI) in speaking about backdoors in encryption programs is understandable – obviously the Ministry has the necessary know-how and may not wish to speak against its better judgment. The only reason why the web interface of the database with the public keys of various departments of the BMI failed to get top marks in the Qualys SSL test was that it accepts RC4 algorithms, which are considered compromised. RC4 is usually permitted so as not to exclude Windows XP users running Internet Explorer 6.0.
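For server operators who want to avoid the same penalty, ruling out RC4 is a one-line configuration change. A minimal sketch in Python – the cipher string below is one plausible choice, not the ministry’s actual configuration:

```python
import ssl

# Build a TLS context and explicitly exclude RC4 using OpenSSL's
# cipher-string syntax. Modern OpenSSL builds drop RC4 from the default
# set anyway; the explicit "!RC4" documents the intent.
ctx = ssl.create_default_context()
ctx.set_ciphers("HIGH:!aNULL:!eNULL:!RC4")

enabled = [cipher["name"] for cipher in ctx.get_ciphers()]
assert not any("RC4" in name for name in enabled)
```

The same exclusion syntax works in most web server configurations that take an OpenSSL cipher list, which is usually all it takes to stop trading security for Internet Explorer 6.0 compatibility.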
This service has been offered for years, said ministry spokesman Karl-Heinz Grundböck on request of ORF.at. There is no record of the number of encrypted emails received, but in any case it is not a high amount. Maybe because there is no easy option for users, like a safe upload form (HTTPS), and public keys are only available via email request. To accelerate this process, so that no valuable time is lost in an emergency, the ministry points out a shortcut:
Encrypted contact with the BMI is possible through this contact page: http://www.bmi.gv.at/cms/bmi_impressum/kontakt
You send a blank email and receive the address of the website securemail.bmi.gv.at.
There, you select “Key / Certificates Search” and type “email@example.com” into the form field.
The comprehensive study of the STOA committee is not the only recent study in this field. At the beginning of the year the EU agency for network and information security (ENISA) published a study entitled “Privacy and Data Protection by Design”, containing conclusions very similar and complementary to the STOA report. ENISA also recommends secure encryption programs, which should be mandatory and considered already in the blueprints of system architectures, so that vulnerabilities are prevented in the first place.
However, if vulnerabilities are introduced intentionally, this is called a backdoor, and the plans of the Council point in this direction. On Thursday, the EU interior and justice ministers will meet informally in Riga to discuss measures against terrorism. But given the nebulously formulated section on encryption in the meeting’s documents, further information on what is actually planned is not to be expected.
They want to explore rules obliging telcos and Internet service providers “to give relevant national authorities access to communications, for example through the transfer of keys,” as the documents of the ministers’ meeting put it. That clarifies nothing, because with end-to-end encryption performed by the client, the provider does not have the key.
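The point can be illustrated with a toy sketch – deliberately not production cryptography, all names and values made up: two clients agree on a key out of band, and the provider only ever relays ciphertext, so a “transfer of keys” by the provider is impossible by construction.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Expand the shared key into a keystream (toy construction; do not
    # use for real data -- it lacks nonces, authentication, etc.).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# The two clients agreed on this key out of band; the provider never sees it.
shared_key = b"client-side secret"

plaintext = b"meeting notes"
ciphertext = xor_bytes(plaintext, keystream(shared_key, len(plaintext)))

# All the provider can hand to an authority is the ciphertext it relays.
relayed_by_provider = ciphertext
assert relayed_by_provider != plaintext

# Only the receiving client, holding the key, can recover the message.
recovered = xor_bytes(relayed_by_provider,
                      keystream(shared_key, len(relayed_by_provider)))
assert recovered == plaintext
```

Whatever obligation is written into law, a provider in this position has nothing to transfer except ciphertext.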
Precisely because current media reports talk of “Crypto Wars 3.0”, it should be noted that this now familiar term for the conflict of the 90s between civil society and the military about encryption was used neither by the participating civil society groups nor by the industry. The term was coined much later – by the very military intelligence agencies involved.
Consequently, NSA and GCHQ named their subsequent programs to undermine cryptographic applications Bullrun and Edgehill – after two battles of their respective national civil wars. The military secret services, financed by civil society to protect it, thereby define their relationship with that very civil society as a kind of “state of war”.
The Call for Papers of DeepSec 2015 is open! We are looking for your presentation and your in-depth training to add to our schedule. There has been a lot of activity in the past six months with regards to information security. Given the cultural and political impact of vulnerable code there are ample topics to talk about and to teach.
Cryptography has had its place in the limelight ever since the high-impact bug with the cute logo. Getting cryptography right has been a problem of developers and academics for decades. Now everyone knows about it. So if you have some research on encryption, authentication, or secure communication in general, send us your thoughts along with your submission.
Protecting your infrastructure is harder than ever before. Once upon a time only your servers and classic clients used the network. Now your telephone, your refrigerator, air conditioning, the light bulbs in the office, coffee machines, door bells, and many devices have network access too. The Internet of Things (IoT) is slowly taking shape. IoT won’t disappear, so you have to adapt. How does this change your information technology architecture? Share your thoughts on this matter with the DeepSec audience.
Are you afraid of the digital age? Do you fear getting your data stolen? Germany may switch back to the good old typewriters to protect data from high-tech espionage. What do you do? How do you prevent spies from using your network and your systems as a drive-through? Let us know.
Software developers and security researchers have left the Golden Age of Science behind. Involuntarily. Now every bit we do is literally being categorised as „cyber war“. The current debate about the handling of 0-days under the Wassenaar Arrangement impacts the IT security industry and academic research. The question is: Will the Internet kill you, or will the Internet kill your research? We are used to publishing security vulnerabilities and talking openly about them. This has changed, and this change is reflected in current policies and laws. We would like to hear your ideas about this matter.
Speaking of policies and laws, we also need to look in this direction. Information security has always been a very interdisciplinary field. Now it reaches far into politics, the military, and aspects of society and culture. If you have facts about the interaction between infosec and the rest of our world(s), you should consider presenting at DeepSec in November.
We all put a varying degree of trust in devices, networks, software, hardware, and algorithms. Have you ever questioned the way we handle trust in a digital world? Of course you can always go with your gut feeling, but this is probably not what you want when it comes to important decisions. We have started a research project to take trust in technology apart and put it back together. In case you have some opinions on this matter that can be turned into a submission for DeepSec 2015, let us know.
When everything fails, you can always use forensics. The pathologist knows what went wrong. Sometimes we have no other means to improve but to learn about what we should have done. The same is true for the digital age. If you are into IT forensics, we would like to hear about what you have seen and what methods work best.
If you have questions, don’t hesitate to get in contact. We are looking forward to receiving your submissions for DeepSec 2015 as soon as possible!
The BSidesLondon event is taking place next week. In case you have missed the tweets and don’t surf the web, check out the schedule. The keynote will shed some light on the gap between information security and technology already being used “out there” in the real world. It’s nice to spend months on solid designs and policies, but this doesn’t help you much when your users go shopping in the meantime.
Further presentations will tell you all about DarkComet, how to rob a bank, Android malware analysis, Point-of-Sale (POS) devices turning you into a billionaire, elliptic curve cryptography for the fearless, hash algorithm magic, infosec for the masses, and much more. You are really in for a treat.
BSidesLondon will feature a rookie track again! Do the rookies a favour and give them a chance. They have worked hard for their talk, don’t let them down. Presenting security issues publicly requires courage, so let’s give them all the support they deserve. We will attend every single rookie talk. Join us!
The DeepINTEL event in September will have a strong focus on a specific kind of intelligence. We will address the issue of espionage. Given the headlines of the past six months it is clear that companies are subject to spying. There is no need for euphemisms any more. Even with half of the information published on this matter, there is no way to deny it. Since the trading of data is a lucrative business, the issue won’t go away. So if you run a company or an organisation, then you might want to deal with risks and threats before they deal with you.
DeepINTEL is focused on security intelligence. Few CISOs and CEOs have a grasp of what this really means. It is much more than doing risk analysis or threat assessment. As we have emphasized time and again, the Big Picture counts. You can counter some threats with a single device or a filtering rule. You cannot easily counter a concerted attack such as espionage. Spying involves many components and steps to retrieve the desired information. It also involves time. In order to detect malicious behaviour over the course of months or even years, you have to apply different techniques. This is where security intelligence enters the stage. You have to create your own tools and processes to get your very own early warning system. The knowledge of your company needs to be part of the analysis. There is no off-the-shelf software which does everything for you. Any vendor claiming to have the solution for your needs in this context is wrong. Of course there are a lot of good tools out there, but first you have to form a clear picture of what your organisation does, where all relevant data sits, and how it flows.
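In its simplest form, such home-grown early-warning logic means baselining normal behaviour over a window and flagging significant deviations. Everything below – the metric, the numbers, the threshold – is hypothetical and only sketches the idea:

```python
from statistics import mean, stdev

def anomalies(daily_counts, window=7, threshold=3.0):
    """Flag days whose event count deviates more than `threshold`
    standard deviations from the preceding `window` days."""
    flagged = []
    for i in range(window, len(daily_counts)):
        base = daily_counts[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and abs(daily_counts[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# e.g. outbound data volume per day (made-up numbers); the last day spikes
counts = [100, 102, 98, 101, 99, 103, 100, 97, 101, 99, 400]
print(anomalies(counts))  # the spike on day 10 gets flagged
```

A real deployment would feed this from your own logs and tune windows and thresholds to your organisation – which is exactly the point: the knowledge of your company has to go into the analysis.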
If you are dealing with espionage in your day job or research ways to counter it, please consider the Call for Papers. Encrypted email preferred. DeepINTEL 2015 takes place on 21/22 September 2015.
We have been quieter than usual. We did a lot of preparations for the upcoming DeepSec events and were busy with research projects. In case you want to update your calendars, here are the dates to look out for.
The Call for Papers for the DeepINTEL is open. Please contact us via (encrypted) email.
The Calls for Papers for DeepSec and BSidesVienna will open soon.