The DMCA (Digital Millennium Copyright Act) is a powerful tool for copyright holders. Takedown notices are served to websites daily to remove infringing items, yet many of them are false positives. Will the DMCA harm cloud computing? I think it's a good possibility.
I recently read an interesting article on SC Magazine about a security researcher who had her MediaFire account suspended for 36 hours because of a DMCA notification. The supposedly infringing files had been on the account for years, and they were malware samples that had been or were being researched by her and others. There is also the case of speeches from the recent political conventions being taken down from YouTube because of automated filters meant to prevent DMCA takedown notices. The false positives that reach the news outlets are a small portion of what is actually out there, but they tend to make big news.
So what does this all have to do with killing the cloud? The answer is quite a lot. If the filters and DMCA searches are conducted in a way that breeds false positives, such as matching only on file names and sizes, then what is to prevent a DMCA notice, and the fight that follows, over one company's private files that happen to share a name with another company's files? Worse yet, what if something is named too similarly to something from the entertainment industry? A presentation that uses music can earn a DMCA takedown notice right there if a file scanner digs into it, or if you leave the name of the song in the filename.
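To make that concrete, here is a purely hypothetical sketch of the kind of lazy matching that breeds these false positives. Everything in it (the catalog entries, file names, and sizes) is made up; the point is only that a scanner comparing names and sizes, without ever looking at content, will happily flag a company's original file.

```python
# Hypothetical sketch of filename/size-only matching -- the kind of lazy
# heuristic that generates DMCA false positives. Names and sizes are made up.
catalog = {
    ("quarterly_numbers.mp3", 4_200_000),   # a "known infringing" entry
    ("keynote_theme.mp4", 98_000_000),
}

def naive_scan(files):
    """Flag anything whose (name, size) pair matches the catalog,
    without ever inspecting the actual content."""
    return [f for f in files if (f["name"], f["size"]) in catalog]

# A company's private presentation soundtrack happens to share a name and size
# with a catalog entry, so it gets flagged even though the content is original.
uploads = [
    {"name": "quarterly_numbers.mp3", "size": 4_200_000},  # legitimate file
    {"name": "board_minutes.docx", "size": 85_000},
]
print(naive_scan(uploads))  # -> the legitimate file comes back as a "hit"
```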
The idea is that all these notices can make people gun-shy about moving to, or even using, the cloud. Copyright is needed, yet its longevity has been blown way out of proportion. Life of the artist plus 70 years is far too long, considering that copyright was meant to foster innovation, not to let someone rest on their laurels. Now we see that it can affect researchers who are turning to the cloud to help analyze the contents of a file. This can affect not only the infosec field but other areas such as medical and other scientific research. All this because one is guilty until proven innocent. It can and will affect the future in more ways than we can see at this time.
Yesterday I said I was surprised by the lack of Windows 8 talk in the keynote. Today remedied that, and added some interesting security facts along the way.
After everything that happened on the first day of TechEd, I was not sure what to expect from the keynote that kicked off day 2. I got a seat much closer to the stage once the room opened up for the speakers. Antoine Leblond was the main speaker, and today was all about Windows 8. Much of the information doled out had been around for a while, along with a couple of nice nuggets. During the introduction, Antoine made sure to point out that touchscreens are coming to laptops and PCs. Although he made it sound imminent, we all know that prices and the economy will really dictate how long it takes for that technology to be adopted.
When we got into the meat of the presentation, certain things jumped out at me. First were the swipe gestures that Windows 8 accepts from a touchpad on a laptop, almost exactly the same as what the MacBook uses today. It got me wondering about patent lawsuits, since the tech industry has gone sue-happy. Then there were the performance enhancements and the addition of a hypervisor native to Windows 8. They did show Windows 7 running very smoothly in that hypervisor, which can be a nice option should you need to run both together. They also showed a nice demo of Windows 7 open in the hypervisor with a Windows 8 Metro app running side by side, so you could see and work with both at the same time.
They went on to talk about the performance improvements, how the convergence of home and work devices helped shape what Windows 8 has become, and then a beta app from SAP. We went over the Windows Store, which is organized very nicely by groups. Other points mentioned during the keynote (which should be available online to watch) included how your desktop will follow you across devices if you use a Windows Live ID to log into the machines, and my personal favorite, booting a machine off a Windows 8 PE image from a USB stick, which will be nice for troubleshooting and malware removal.
The next session I went to was 10 Administrator Security Mistakes, hosted by an MVP and pen tester from Poland named Paula. This woman knows her stuff and showed some things that can put the fear of security into you. How a password reset on a DC leaves the password in clear text in memory, and how easy it can be to get at, was one of the most eye-opening demonstrations I have seen. So, in order, the top ten we were given are:
Sin 10: Misunderstanding how passwords are used
Sin 9: Ignoring offline access
Sin 8: Incorrect access control (We were shown how Robocopy can be used to gain access to a folder you have a deny permission on; see the sketch after this list)
Sin 7: Using old technology
Sin 6: Encryption, what is encryption? (We were shown how HTTPS does not guarantee encryption, via a man-in-the-middle hack that showed LinkedIn sending passwords in clear text)
Sin 5: Installing Pirated software
Sin 4: Lack of network monitoring (Again we were shown an issue with the reset password feature in AD, where it sends the password to the network broadcast address in clear text)
Sin 3: What you see is not what you get
Sin 2: Too much trust in people
Sin 1: Lack of documentation
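For Sin 8, the trick relies on Robocopy's backup mode. As a rough, hypothetical illustration of the idea rather than Paula's exact demo: an account that can use backup mode (an admin or Backup Operator) can copy a folder it is explicitly denied access to, because the /B switch uses the backup privilege and skips the normal ACL check. The paths below are made up, and this only runs on Windows with Robocopy available.

```python
# Hypothetical illustration of Sin 8: Robocopy's backup mode (/B) copies files
# using the backup privilege, so an explicit "deny" ACL on the folder does not
# stop an account that is allowed to run in backup mode (e.g. a Backup Operator).
import subprocess

src = r"\\fileserver\HR"   # folder the account is denied read access to
dst = r"C:\loot\HR"        # local copy lands here anyway

# /B = backup mode, /E = copy subdirectories including empty ones.
# check=False because Robocopy returns non-zero exit codes even on success.
subprocess.run(["robocopy", src, dst, "/B", "/E"], check=False)
```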
The final session I went to was on using Sysinternals software to fight malware. Mark Russinovich, who created Sysinternals, was the speaker. The seminar was a logical progression, from how to approach the removal process to how to use the different tools to discover different items. Again, this one was recorded, so it should be available soon for viewing. Needless to say, even on Windows 8 there are gotchas, and some tools, such as msconfig, don't have the information they used to. Between Process Explorer, Autoruns, Desktops, and Process Monitor, one should be able to find most if not all malware. Considering Mark said that 33% of web malware is not detectable because of the time it takes AV vendors to get signatures out, this seminar is a must for anyone dealing with malware removal.
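None of that replaces watching the recording, but the underlying workflow is the classic one: baseline what auto-starts on a known-clean machine, then diff against that baseline when you suspect an infection. Here is a rough, hypothetical Python sketch of that idea; it is not Mark's tooling, and it only watches two Startup folders, where real Autoruns coverage is far broader.

```python
# Hypothetical baseline-and-diff sketch in the spirit of the Autoruns workflow:
# hash everything in a couple of autostart locations on a known-clean machine,
# save the result, and later compare to spot new or altered entries.
# (Real autostart coverage is far broader -- this is only an illustration.)
import hashlib, json, os, sys

AUTOSTART_DIRS = [
    os.path.expandvars(r"%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup"),
    os.path.expandvars(r"%ProgramData%\Microsoft\Windows\Start Menu\Programs\Startup"),
]

def snapshot():
    """Return {path: sha256} for every file in the watched directories."""
    result = {}
    for d in AUTOSTART_DIRS:
        if not os.path.isdir(d):
            continue
        for name in os.listdir(d):
            path = os.path.join(d, name)
            if os.path.isfile(path):
                with open(path, "rb") as fh:
                    result[path] = hashlib.sha256(fh.read()).hexdigest()
    return result

if __name__ == "__main__":
    if sys.argv[1:] == ["baseline"]:
        # Run once on a clean machine to record the baseline.
        with open("baseline.json", "w") as fh:
            json.dump(snapshot(), fh, indent=2)
    else:
        # Later, report anything that is new or has changed since the baseline.
        with open("baseline.json") as fh:
            baseline = json.load(fh)
        for path, digest in snapshot().items():
            if baseline.get(path) != digest:
                print("NEW or CHANGED:", path)
```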
Day 2 was a lot more intense and, overall, a lot better in quality than day 1 for me. Tomorrow, there are more seminars to be had and more things to do.
Extra charges for one-time online payments, 4G outages, the FTC starting to look at their business practices. Verizon, what have you done?
I was going to give a review of the Motorola Droid RAZR today, but decided to push that off. See, the RAZR is available only through Verizon, and I noticed the start of yet another outage of 4G service this morning. Verizon has said these outages are growing pains, and were the 4G network brand new, I would accept that, but it is not. Verizon has had its 4G network up for just over a year and should know how to handle growth. They were the ones who didn't have the issues AT&T had with the explosion of smartphones. Of course, that was CDMA vs. GSM. Now it's LTE vs. LTE, and AT&T might have the advantage.
See, both are using LTE networks, which require the use of a SIM card. AT&T, whose network is still known for poor quality and lots of dropped calls, at least has a head start in dealing with the issues of a network that requires SIM cards. I wish I had proof, but it seems that SIM cards, or at least the networks that require them, are not as stable here in the States as a network like CDMA, which has no SIM card. (At the time of writing this, the 4G network just came back up after being inaccessible for an hour.) It would be interesting to hear from someone on the differences between the two networks and why the ones that need SIM cards seem to be less reliable.
Now this comes on the heels of the FTC announcing it was probing Verizon over the $2 convenience fee it was going to charge and then pulled back on. Verizon's statement is that even paying online has its costs. And they are right: there are equipment and software costs, maintenance on the systems, and hardening the equipment against hackers and other forms of data breaches. Still, the costs are largely the same whether people use the automated system or pay individually, unless Verizon has to run two separate systems or the company processing the payments charges an extra fee. Either way, there are other options to reduce the cost. From a security standpoint, though, the single payment, which I use, is a safer bet, not just for people who want to know the money is in their account before paying, but from a breach standpoint.
Just think about it. If you sign up for automated payments, Verizon and the third party that processes the payments both have your bank account or credit card information saved on servers. These servers are supposed to be PCI compliant, and even if they are, PCI compliance is a joke. Think of the banks (all of which have to follow at least PCI compliance), or stores (which have to be PCI compliant), or anything that does online transactions, and how many breaches we hear of. Now think about how many breaches we don't hear of, at least not immediately. Compare that to single payment options, where you can choose not to save the payment info on their servers. Yes, there are still problems that can arise from man-in-the-middle attacks, spoofed SSL certificates, etc., but once you make that payment, the info is not supposed to be stored anywhere. That means that if Verizon, or its third-party payment processor, has a security breach, your payment information should not be compromised. It might just be me being paranoid, but from a logic standpoint it does seem safer.
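To show the difference, here is a deliberately simplified, hypothetical sketch. With automated payments, the full card number sits in the merchant's database waiting to be breached; with a one-time payment, only a masked reference is kept once the charge clears. Real processors tokenize and never handle raw numbers this casually; this only illustrates what is and is not left behind.

```python
# Deliberately simplified, hypothetical sketch: what is left on the server after
# a card-on-file "auto pay" enrollment versus a one-time payment that keeps only
# a masked reference. Real payment systems tokenize; this only shows the exposure.
stored_records = []

def enroll_auto_pay(customer, card_number):
    # Card-on-file: the full number now lives in the database, so any breach
    # of this store exposes it.
    stored_records.append({"customer": customer, "card": card_number})

def one_time_payment(customer, card_number):
    # The charge itself is omitted; afterwards only a masked reference remains.
    stored_records.append({"customer": customer, "card": "****" + card_number[-4:]})

enroll_auto_pay("alice", "4111111111111111")
one_time_payment("bob", "4111111111111111")
print(stored_records)
# -> alice's full number is stored; bob's record keeps only the last four digits.
```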
Now, Verizon did withdraw the $2 fee idea pretty quickly, but expect to see it show up again and again. The bigger thing Verizon has to worry about right now is the amount of bad press it is receiving. They need to remember that pissing one customer off means that customer is going to tell their friends and family, and eventually it can and will take a toll on business.