Tag Archives: security

Responsibly Bringing a new Cryptography Product to Market

Post-Snowden, technologists have rushed a variety of “liberation tech” projects to market, making boastful claims about their cryptographic capabilities to ensure the privacy of their customers. These goals are noble, but the results have sometimes been embarrassing.

We’re building a new crypto product ourselves: a high-level secure-by-default framework developers can use to build end-to-end cryptographic applications without writing crypto.

Here’s what we required:

  1. Be open source, so it can be independently verified
  2. Have a spec
  3. Have a threat model
  4. Have clear, well-documented code
  5. Be audited by security professionals with a crypto background

In this post I’ll share how we’re going about #5. We’re committed to development in the open, including security review.

The first audit we could schedule was with 3 researchers from the Least Authority team. Among other reasons we chose them because they have deep experience building verifiable storage systems. For anyone in that market, Tahoe-LAFS is a must read.

Auditing is both expensive and hard to schedule, with leading organizations booked months in advance.  The best teams are not limited by their ability to sell their services but rather by their ability to hire and fulfill that work. Consequently there’s very little downward pressure on their rates.

To get the most from a security audit, it’s best to go in with the cleanest code possible. It’s like brushing your teeth before you visit the dentist. It’s impolite and ineffective to ask someone to puzzle over the subtleties of code you haven’t clarified [1].

We focused this first audit narrowly on a bare bones single-user (no collaboration or multi-user sharing) demo application built with the Crypton framework. Our goal was good coverage of the framework’s core fundamentals: account creation, authentication, and single-user data storage.

Unfortunately, by the time we could schedule the audit to begin, there were three issues that the Crypton team knew about but hadn’t had a chance to fix or even document. The auditors independently discovered two of those three issues, along with a lead to the third (less severe) issue, tagged [UNRESOLVED] in their report. Additionally, they found three other serious issues unknown to the team. Overall, some of the best money we’ve ever spent!

Since the purpose of this post is to give clear expectations, I think it’s important to share real numbers, and we cleared doing so with Least Authority.

Zooko explained, “We gave SpiderOak a small discount on our normal price, and moreover we pushed back our other projects in order to get the work done for you first. We did these two things because we wanted to form a relationship with SpiderOak since you provide end-to-end-encrypted storage, and we wanted to support Crypton because it is end-to-end-encrypted and is fully Free and Open-Source Software.”

Our bill was $30,000, or about $5k/researcher per week.

We have a second audit with the nice folks at Leviathan Security, covering the multi-user features of Crypton, and we’ll share that report when it’s complete. In the meantime, here’s the report (rst, pdf) from the first audit by Least Authority.

Here are some of the resulting GitHub issues and pull requests that resolve the findings: Issues B, C, D, and E.

The resolution for Issue A involves a switch to SRP-based authentication. This was already part of the longer-term roadmap, as it provides several additional benefits, but it proved to be a nontrivial undertaking and that effort is still ongoing. The next audit, by Leviathan Security, gives some attention to this implementation.

Update: Zooko at Least Authority just published an article discussing their motivation for accepting the project.

Update 2: The originally published version of this post erroneously linked to a non-final draft of the report from Least Authority. That link is corrected, and the final audit report should say “Version 1, 2013-12-20” at the top.


[1] Zooko shared a story about an experiment that was conducted by Ping Yee in 2007. The results of the experiment illustrate auditing challenges.

In short, several very skilled security auditors examined a small Python program — about 100 lines of code — into which three bugs had been inserted by the authors. There was an “easy,” a “medium,” and a “hard” backdoor. There were three or four teams of auditors.

1. One auditor found the “easy” and the “medium” ones in about 70 minutes, and then spent the rest of the day failing to find any other bugs.

2. One team of two auditors found the “easy” bug in about five hours, and spent the rest of the day failing to find any other bugs.

3. One auditor found the “easy” bug in about four hours, and then stopped.

4. One auditor either found no bugs or else was on a team with the third auditor — the report is unclear.

See Chapter 7 of Yee’s report for these details.

I should emphasize that I personally consider these people to be extremely skilled. One possible conclusion that could be drawn from this experience is that a skilled backdoor-writer can defeat skilled auditors. This hypothesis holds that only accidental bugs can be reliably detected by auditors, not deliberately hidden bugs.

Anyway, as far as I understand the bugs you folks left in were accidental bugs that you then deliberately didn’t-fix, rather than bugs that you intentionally made hard-to-spot.

Tomorrow is ‘The Day We Fight Back’ against mass surveillance

In Matt Damon’s AMA on Reddit last week, he was asked:

Hey Matt, your amazing monologue about the NSA in Good Will Hunting is probably more relevant today than it was when the film was first released. How did you come up with that scene, and are you at all surprised by the revelations on the NSA from the information released by Snowden? 

Here is the clip from Good Will Hunting:

Matt’s reply:

“Well, the first thing to that monologue is it’s safe to say that is the hardest that Ben and I have ever laughed while writing something. We were in our old house in Hollywood, in the basement of this house writing this thing and we were literally in tears because this monologue kept building on itself. We wrote it in one night and kept performing it back and forth, and pissing ourselves laughing.

You know, I was unaware, as I think everyone was, that they had that capacity. Snowden is literally changing policy. These are conversations we have to have about our security, and civil liberties, and we have to decide what we are willing to accept, and he’s provided a huge service kickstarting that debate…”

If you haven’t yet heard, tomorrow brings one of those conversations about our security, civil liberties, and what we’re willing to accept – it’s called The Day We Fight Back.

Thedaywefightback.org screen shot

“Together we will push back against powers that seek to observe, collect, and analyze our every digital action. Together, we will make it clear that such behavior is not compatible with democratic governance. Together, if we persist, we will win this fight.”



In the U.S.: Thousands of websites will host banners urging people to call and email Congress. Ask legislators to oppose the FISA Improvements Act, support the USA Freedom Act, and enact protections for non-Americans.

Outside the U.S.: Visitors will be asked to urge appropriate targets to institute privacy protections.

Global events: Events are planned in cities worldwide, including in San Francisco, Los Angeles, Chicago, Copenhagen, Stockholm and more. Find an event near you.

Add the banner to your site now: Grab the banner code on thedaywefightback.org. They’ve built special plugins for WordPress and CloudFlare users and also have a special version of the banner that pushes people to call over email.

Will you join us? 

Place Privacy First for National Cyber Security Awareness Month

National Cyber Security Awareness Month champion - SpiderOak

October is National Cyber Security Awareness Month (NCSAM). We believe you shouldn’t have to sacrifice privacy in our online world. Furthermore, we believe privacy is the best form of security. Building privacy into technology is the key to true freedom online.

In honor of NCSAM, SpiderOak would like to offer you 25% off our yearly plans.

For the rest of October, give your ‘+1 for Privacy’ as you backup, sync and share with this special promotion.

Visit SpiderOak.com/signup and use the promo code “plus1forprivacy” in your account settings for 25% off. That is only $7.50/month for 100% private cloud storage.

Current Users:

  1. Log in to your account online.
  2. Go to your ‘Account’ tab at the top.
  3. Click ‘Buy More Space,’ and then choose ‘Upgrade My Plan.’
  4. Enter the promo code plus1forprivacy, and choose which plan you want under Yearly Billing.

New Users (welcome!):

  1. Sign up here
  2. Download and install the client
  3. Click ‘Buy More Space’ in the client itself, or via the web portal (which will then take you to a new screen, where you need to choose ‘Upgrade My Plan’)
  4. Use the promo code plus1forprivacy and choose which plan you want under Yearly Billing.

2013 marks a decade of National Cyber Security Awareness Month bringing awareness to online safety and security issues and helping educate people about the best ways to protect their privacy online. Join us over the next couple of weeks in acknowledging NCSAM as it emphasizes the roles and responsibilities each of us plays in helping to create a safer digital world. We hope you make time during the month, and throughout the year, to take proactive steps to help safeguard yourself, your friends, and your family.

Thanks for helping promote the importance of privacy!

Privacy VS. Security in a PRISM: The Important Difference

The events of these last many days certainly raise awareness around the integrity of data and the companies we entrust with it. Many of the articles and posts have pored over the impacts: the good, the bad, the necessity, the importance, the invasive, the threat, the martyr and so on. Given this flood of commentary, I would like to spend some time writing about a finally emerging concept – privacy. And further – how privacy is substantially differentiated from security.

To begin, let’s review the definitions of these two words (according to Google):

Security – The state of being free from danger or threat

Privacy – The state or condition of being free from being observed or disturbed by other people

Of all the conversations and dialogue about PRISM, none has concentrated on the security measures in place at companies like Google, Facebook, Amazon, Apple, Verizon, and others. Why, you might ask? Because this was not a breach of security. No one hacked into their systems. No one confiscated passwords. Rather – according to reports – these companies willingly complied. [Note: It would be appropriate to draw attention to the NSA's own security breach in light of Edward Snowden's ability to access and confiscate these documents.]

If the world were oriented around privacy, the ability of a 3rd-party provider of web-based services (such as Google or Facebook or Dropbox or SpiderOak) to access the plaintext data would be removed. In other words, privacy takes away the ability to access the data in a meaningful way, such that it cannot be supplied to government agencies or stolen by hackers.

We are not now nor have we ever suggested that there isn’t a need for security; in fact, security is absolutely critical. And for many implementations of  various services, privacy is not applicable. However – in the world of conversation and creation of personally owned content from photos to chat to calls to spreadsheets to documents – privacy is absolutely a critical component that can be achieved.

My hope is that we – as a society – will now start asking the question: Why? Why do companies have access to my photos and documents and chat conversations? Is it a necessary part of the service they are offering? A convenience for me? If yes, what are these companies doing to keep my data private? And are there alternatives if I do want real privacy? From the NSA? From the company? From anyone?

This dialogue is critical and I am very glad to see the word ‘privacy’ start to weave its way into conversations. Further, that the public is being educated on the important difference between privacy and security and – hopefully – we all can start making choices accordingly.

For more information on this topic, please visit ZeroKnowledgePrivacy.org and/or watch the explainers below on Privacy VS. Security and the important role of the Privacy Policy.


AMA: Interview with Cryptographer, Computer Security Expert Jon Callas

Jon worked on Apple’s Whole Disk Encryption, PGP (Pretty Good Privacy) Universal Server, co-founded the PGP Corporation, is former CTO of Entrust, and current co-founder and CTO at Silent Circle (Global Encrypted Communications). As an inventor and cryptographer, his designs of security products have won major innovation awards from The Wall Street Journal and others.

Last week, you submitted your questions for Jon Callas, one of the world’s most respected and brilliant minds when it comes to software security and privacy. We chose five of them, which we sent to Jon. These are his answers.

1. How did you become a security expert / cryptographer?

A long time ago, I worked at the best computer science grad school there was — VMS development at Digital Equipment Corporation. One of the great things there was that I got to work on a wide variety of things, from graphics to schedulers to memory management to operating system security. A lot of the problems we had to deal with at the time are still relevant issues. I did a random password generator among other things, and I still use that for my own passwords.

When DEC fell apart, like many people, I started a startup with a number of friends. We built a system that let you do meetings as well as play games, socialize, and collaborate. It got rave reviews. The venture capital people said to us, “This is amazing! I want to invest in this in ten years!” That was when I started getting into cryptography. People didn’t want to do collaboration on the then very-early Internet without encryption. There was no SSL at the time, either.

So I went to the second ever RSA conference to learn to do enough cryptography to protect our network. I ended up sitting next to a guy named Bruce who had just written a book called “Applied Cryptography” and he had a bunch of them in a duffel bag with him, so I bought one. I may have bought the very first copy; I know I was the first person at RSA who bought one. I asked him to autograph it, and he said, “I can’t deface a book!” I replied that it’s not defacement if you’re the author.

After we got tired of throwing our money into our startup, I went to work for Apple in the Advanced Technologies Group and worked for Gurshuran Sidhu, who was the inventor of AppleTalk, and shipped the very first crypto built into an OS, called PowerTalk. It failed for being far too early, as well. One of its pieces, though, was this password manager called The Keychain, and I claimed that it was a great thing. While it was hardly perfect, it encouraged good password use, and that was better than anything else. So Bruce Gaya and I hacked The Keychain so that you could run it without the rest of PowerTalk, and thus rescued it from oblivion. The present Keychain on Apple products is completely and utterly rewritten, but I’m proud of saving it. I also built a random number manager for Macs that’s now lost to the mists of time.

That was the worst time to be working for Apple, the year before Steve Jobs came back. I named all my computers for things in The Hitchhiker’s Guide to the Galaxy, because as I said, having been through DEC’s collapse I felt like a bowl of petunias (“Oh, no, not again”). When SJ came back, we heard a lot about what his plans were, as he and Sidhu were old friends. We knew that he was planning to get rid of all of ATG, so we wondered what to do. Sidhu wanted to start a startup, but none of us had any ideas we really liked. I could have easily gone into the OS group. A friend of a friend said that Phil Zimmermann’s PGP Inc was looking for server architects, and I interviewed there and got an offer. I thought it was a great way to do fun things and change the world for the better, so I went there. That was a great place to really become an expert.

2.  Are there any localities where it is illegal to encrypt calls, text messages, or emails?

Maybe. That’s not a good answer, is it?

In civilized countries, the answer is no. I might even go so far as to say that the places where it’s not only legal but even expected are pretty tightly correlated with civilized countries. Repressive governments often try to restrict crypto. I’m sure Syria’s got its opinions, but I’m not an expert on Syrian law.

There are places where there are restrictions, but they are also so filled with exceptions that it’s hard to give a definitive answer. For example, China has import restrictions on cryptography. But there are exemptions for non-Chinese doing business there or Chinese people who are doing business with other countries. I am also nothing like an expert on Chinese law.

My rule is that I worry about the laws of countries that I want to operate in. I need to know about them, there. Other places I just ignore.

Most often, even in repressive countries, they aren’t worried about the crypto as such, they’re worried about what the people are using the crypto for.

3. What are you working on right now that has you the most excited?

On a large scale, it’s Silent Circle. The biggest problem we’ve always had with crypto is that it’s hard to use. Usability is key because if it’s hard to use, then people use insecure systems. They don’t stop talking, they stop being secure. So your security has to fade into the background. It has to be ignorable. But it also has to be there, as well. That’s a paradox.

We also have learned one of the best ways to make security workable is to have it run by an expert staff. So the question is how to have an expert staff running the security and privacy for people who need it and yet the staff can’t undetectably compromise the people using the system. We have a lot of innovative things we’re doing to make the security fade into the background and yet be there.

On a small scale, I’m taking my old password generator from VMS and making it into an iPhone app. I was doing a lot of work on it before Silent Circle as a hobby, and I really ought to finish.

4. As an expert on encryption do you see a natural relationship between encryption and the law? What’s your stance on how encrypted data should be treated when there’s no idea what it may contain? In some countries there are what I consider very severe key disclosure laws and I wonder if there will ever be a duress scheme or method of deniable encryption that could be so perfect as to make the laws moot.

I think it’s an unnatural relationship between encryption and the law. All technologies can be used for good or ill. It’s true for fire. It’s true for just about anything. Encryption, interestingly, is rarely *directly* used for ill. Yes, there are data ransom schemes that use encryption for ill, but that’s not what people are concerned about.

It’s part of our belief in human rights that we believe in the right to be left alone. Yet many people lose their nerve when it comes to privacy technologies on computers and networks. I think that’s an artifact of the fact that we’re comfortable with door locks or window curtains, but every time someone thinks about encryption, the James Bond theme starts playing in their head. That’s an artifact of the relationship between encryption and disagreements between nation-states. With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.

“With the Internet and computing everywhere, not using encryption is like having an unlocked house with no curtains.”

My stance on encrypted data per se is that it’s data. Everyone has reasons that they want something to be private. Everyone has things that *must* be private, like their own records or someone else’s records, which usually *must* be protected. This might have been an interesting debate way back in the 1990s, but it isn’t any more.

I don’t know what to say about key or data disclosure laws. In the US, there’s movement in the courts towards protecting encrypted data in some way or other. It has all revolved around passwords specifically, but the real issue is a Fifth Amendment issue. Relatively few countries have equivalents of the Fifth Amendment.

But in the UK, for example, they don’t have protections against self-incrimination. As a matter of fact, we have one in the US *because* they don’t have one there. They have a disclosure law, RIPA. I think its application has been pretty embarrassing, as I can’t think of a case where its use did much more than make the defendant more sympathetic.

I am not a fan of deniable encryption and personally, I think it’s impossible. Deniable encryption seems to me to be predicated on the idea that your attacker is either a nice person or stupid. Stupid in the sense that you are managing to hide the fact that you’re using deniable encryption. That predicates that either you’re using something completely custom, or they don’t realize that the deniable encryption is there. That’s what I mean by stupid — you’re pulling a fast one on them and they don’t know it. By being nice, they know you have deniable encryption and yet they’ll say, “Well, I guess if we can’t *prove* you have something encrypted there, I guess you don’t!”

A couple of years ago, I was chatting with the customs agency of a civilized country. I asked them about TrueCrypt and its deniable disk volume. They said, “Oh, we know *all* about TrueCrypt!” One of the guys I talked to added, “If we see you’re using TrueCrypt, we just ask you for the second password.” I asked what happens if someone doesn’t have a second volume and they replied, “Why would someone do *that*? I mean, that’s the whole point of TrueCrypt, to have a second volume. What kind of idiot would install TrueCrypt and not have a second volume?” We chatted some more and one of them said, “Look, we don’t look in someone’s laptop for no reason. We have better things to do. If we’re asking for your computer, it’s not because you had a piece of fruit in your bag. If we find special encryption, we know we’re on to something.” I asked again about someone who *doesn’t* have a hidden volume, and they said that you’d have to sit in a room for a while, until you convince them you don’t.

This is the real issue, I think. If you’re in a nice jurisdiction — one where you can say, “Look, I’m an honest person and I have encryption, and no I’m not going to tell you my password” then deniable encryption might work. But if you’re in a jurisdiction where they aren’t nice, then you’re actually more at risk using something that makes you look like you’re up to something.

Ironically, this is an effect of the fact that we’ve succeeded in making encryption normal.

5. What is your favorite movie?

There are relatively few movies that I’m willing to watch more than once. I’m apathetic about special effects, but a sucker for great dialog.

One of the very few movies I can watch over and over is The Princess Bride. One of my favorite lines to live by is, “Nonsense. You’re only saying that because no one ever has.”

Thanks Jon! If you are interested in learning cryptography, we recommend reading his PDF, An Introduction to Cryptography. Otherwise, be sure to follow or like Silent Circle to stay in stride with their efforts and support their work in encrypted communications.

Exploit Information Leaks in Random Numbers from Python, Ruby and PHP

The Mersenne Twister (MT19937) is a pseudorandom number generator used by Python and many other languages, like Ruby and PHP. It is known to pass many statistical randomness tests, but it’s also known to be not cryptographically secure. The Python documentation is clear on this point, describing it as “completely unsuitable for cryptographic purposes.” Here we will show why.

When you are able to predict pseudorandom numbers, you can predict session IDs, randomly generated passwords, or encryption keys; know all the cards in online poker games; or play “Asteroids” better than legally possible.

Many sources have already shown that it’s easy to rebuild the internal state of the MT by using 624 consecutive outputs. But this alone isn’t a practical attack, because it’s unlikely that you have access to the whole output. In this post I’ll demonstrate how to restore the internal state by using only parts of the output. This will allow us to predict all previous and future outputs.
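As background, the classic full-output attack is short enough to sketch here. This is my own minimal Python sketch (the function names are mine): it inverts the tempering applied to each of 624 raw 32-bit outputs and re-injects the recovered words with random.setstate to clone CPython’s generator.

```python
import random

def undo_right(y, shift):
    # invert y ^= y >> shift by fixed-point iteration:
    # each pass recovers `shift` more correct bits from the top down
    x = y
    for _ in range(32 // shift + 1):
        x = y ^ (x >> shift)
    return x

def undo_left(y, shift, mask):
    # invert y ^= (y << shift) & mask, recovering bits from the bottom up
    x = y
    for _ in range(32 // shift + 1):
        x = y ^ ((x << shift) & mask)
    return x & 0xFFFFFFFF

def untemper(y):
    # run the MT19937 tempering steps backwards
    y = undo_right(y, 18)
    y = undo_left(y, 15, 0xEFC60000)
    y = undo_left(y, 7, 0x9D2C5680)
    y = undo_right(y, 11)
    return y

def clone_mt(outputs):
    # rebuild a generator from 624 consecutive raw 32-bit outputs
    state = [untemper(o) for o in outputs]
    r = random.Random()
    r.setstate((3, tuple(state) + (624,), None))
    return r
```

Feeding it 624 values from random.getrandbits(32) yields a clone whose future outputs match the original exactly — which is precisely why the interesting case is the partial-output one.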

With every 32-bit output, the MT directly exposes 32 bits of its internal state (only slightly and reversibly modified by the tempering function). After each round of 624 outputs, the internal state of the Mersenne Twister is “twisted” itself: all bits are XOR’d with several other bits. In fact, the Mersenne Twister is just a big XOR machine: all its output can be expressed as a sequence of XORs of the initial state bits.
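To make this concrete, the twist and the tempering can be sketched in a few lines of Python. The constants are the published MT19937 ones; the sanity check against CPython’s own generator (reading its 624 state words via getstate) is my own addition.

```python
import random

def temper(y):
    # output side: a short, invertible chain of shift-and-XOR steps
    y ^= y >> 11
    y ^= (y << 7) & 0x9D2C5680
    y ^= (y << 15) & 0xEFC60000
    y ^= y >> 18
    return y & 0xFFFFFFFF

def twist(state):
    # state update: every new word is an XOR of a few old state bits
    for i in range(624):
        y = (state[i] & 0x80000000) | (state[(i + 1) % 624] & 0x7FFFFFFF)
        nxt = y >> 1
        if y & 1:
            nxt ^= 0x9908B0DF
        state[i] = state[(i + 397) % 624] ^ nxt

# sanity check: reproduce CPython's outputs from its internal state
r = random.Random(1)
state = list(r.getstate()[1][:624])  # freshly seeded: the next call twists
twist(state)
assert [temper(w) for w in state[:5]] == [r.getrandbits(32) for _ in range(5)]
```

Note that both steps are nothing but XORs of shifted state bits, which is what makes the equation-solving approach below possible.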

Python always combines two outputs into a 64-bit integer before returning them as random integers. So each call to random.randint(0, 255) gives you only 8 bits out of two 32-bit Mersenne Twister outputs. Since the tempering function has already mixed the 32-bit outputs, it’s no longer possible to directly recover internal state bits from only those 8 bits.

I was curious how hard it would be to recover the internal MT state by using only the output of a function like this:

import random

def random_string(length):
    # each byte reveals only 8 of the 64 bits consumed from the generator
    return "".join(chr(random.randint(0, 255)) for i in range(length))

Since the internal state of the Mersenne Twister consists of 19,968 bits, we will need at least ~2.5 KB of output to recover the internal state. In fact I needed ~3.3 KB, probably because of redundant output information. A bug in my POC implementation is also possible :)

You can find the result on github.

How does it work?

First I named the initial state bits s0…s19967. The initial state looks like this:

Internal state bit    Value
0                     s0
1                     s1
...                   ...
19967                 s19967

Now the first output of the Mersenne Twister is a combination of the first 32 bits (combined by the tempering function):

Output bit    Value
o0            s0 xor s4 xor s7 xor s15
o1            s1 xor s5 xor s16
o2            s2 xor s6 xor s13 xor s17 xor s24
...           ...
o31           s2 xor s9 xor s13 xor s17 xor s28 xor s31

The same holds for the second output:

Output bit    Value
o32           s32 xor s36 xor s39 xor s47

But we can only observe eight of these bits, because random.randint(0,255) exposes only this portion of the output.

After 624 outputs, the internal state of the Mersenne Twister is “twisted” around. We update our bookkeeping accordingly: each new state bit is an XOR combination of the old state variables.

Internal state bit    Value
0                     s63 xor s12704
1                     s0 xor s12705
...                   ...
19967                 s61 xor s62 xor s5470 xor s5471 xor s18143

The outputs now look more complicated, because the state bits are themselves XOR combinations of the initial state:

Output bit    Value
o19968        s35 xor s38 xor s46 xor s63 xor s12704 xor s12708 xor s12711 xor s12719

After 3.3 KB of output, these XOR expressions contain about 40 variables each.

Now we have a big list of output bits and how each one is built as an XOR combination of the original state bits: a big system of equations that we can solve! This is done just as you learned in school. Here’s a simple example for 3 bits.

Given this system of equations:

o1 = s0 xor s1 xor s2
o2 = s1 xor s2
o3 = s0 xor s1

First we solve for s0:

o1 = s0 xor s1 xor s2
o2 = s1 xor s2
o1 xor o2 = s0

With this solution it’s easy to find s1:

o3 = s0 xor s1
o1 xor o2 = s0
o1 xor o2 xor o3 = s1

And finally s2:

o2 = s1 xor s2
o1 xor o2 xor o3 = s1
o1 xor o3 = s2

Now we know how to recover the 3-bit state from our 3 output bits:

s0 = o1 xor o2
s1 = o1 xor o2 xor o3
s2 = o1 xor o3

However, in reality we have about 26,000 equations with 20,000 variables.
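Mechanically, the elimination can be done by packing each equation into a Python integer bitmask. This is a toy sketch of my own (not the actual code from the github repo), shown on the 3-bit example above:

```python
def solve_xor_system(equations, nvars):
    # equations: list of (mask, rhs) pairs; bit i of `mask` set means
    # variable s_i appears in the XOR, rhs is the observed output bit
    pivots = {}  # highest set bit -> reduced (mask, rhs)
    for mask, rhs in equations:
        while mask:
            top = mask.bit_length() - 1
            if top not in pivots:
                pivots[top] = (mask, rhs)
                break
            pmask, prhs = pivots[top]
            mask ^= pmask  # eliminate the known pivot variable
            rhs ^= prhs
    # back-substitute, lowest pivot first (free variables default to 0)
    sol = [0] * nvars
    for top in sorted(pivots):
        mask, rhs = pivots[top]
        low = mask ^ (1 << top)
        while low:
            b = low.bit_length() - 1
            rhs ^= sol[b]
            low ^= 1 << b
        sol[top] = rhs
    return sol

# the 3-bit example above, with hidden state (s0, s1, s2) = (1, 0, 1)
print(solve_xor_system([(0b111, 0), (0b110, 1), (0b011, 1)], 3))  # → [1, 0, 1]
```

At the real problem size (~26,000 equations, ~20,000 variables, thousands of bits per mask) the same idea works; it is just slower.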

If you want to try it yourself, you can download the result of the solved equations together with a test program on github.

Further notes

Since the Mersenne Twister is highly symmetric, it’s probably possible to find some shortcuts or a fully mathematical solution for this problem. However, I implemented the straightforward solution since it’s easy and reusable.

Python seeds the Twister with only 128 bits of “real” randomness. So theoretically it’s enough to know a few output bytes to restore the whole state, but you would need an efficient attack on the seeding algorithm, since 128 bits is too much for a brute-force attack.

However, other implementations use much less randomness to seed their random number generators. PHP seems to use only 32 bits to seed mt_rand; Perl also uses only 32 bits (but a different PRNG). In these cases it’s probably easier to brute-force the seed.
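To illustrate how weak a 32-bit seed is, here is a minimal brute-force sketch. It uses Python’s generator as a stand-in and a deliberately small search space for the demo; attacking a real 32-bit seed follows the same pattern, just with a 2^32 loop.

```python
import random

def recover_seed(observed, search_space):
    # try every candidate seed until the generated prefix matches
    for seed in range(search_space):
        r = random.Random(seed)
        if all(r.getrandbits(32) == o for o in observed):
            return seed
    return None

# demo: recover a small seed from the first four outputs
secret = random.Random(31337)
observed = [secret.getrandbits(32) for _ in range(4)]
print(recover_seed(observed, 1 << 16))  # → 31337
```

A few observed outputs are enough, because the chance of two different seeds producing the same 128-bit prefix is negligible.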

It’s time to kill ‘online’. And buy clean milk.

As someone who has been ‘online’ since the early ’90s, listening to the emerging conversation around privacy, security, and integrity makes me want to flip a virtual table.

Having managed and built such sexy things as ‘direct marketing and selection systems’ for longer than I care to admit, I can honestly say that the argument that the silent collection of user data must continue because blocking it “degrades the experience for the majority of users” (article link: Yahoo will ignore Do Not Track for IE10 users) is bullshit – pardon my frankness. A more honest description would read somewhere along the lines of ‘we make lots of money selling and distributing user data because it costs us nothing and is worth a lot of money’. (Please don’t sue me!)

So the question remains: why are we still living in a world whereby every time we visit a website the operators are silently – and in some cases without express consent – gathering all sorts of information on our location, previous shopping habits, age, demographic and a slew of other preferences?

To display the vast differences between ‘online’ and everything else, let’s look at two simple examples:

If you walk into Walgreens and buy a pack of gum, you have the very visible choice of joining any of at least two or three savings programs, giving money to starving children, or just registering for future bonuses. In the physical world this is a very clear and conscious choice that most people (including myself) decline or accept based on our personal preference.

Simple, isn’t it?

However, the virtual world plays by a wholly different set of rules.

Every time you visit a website you are likely to be giving away a number of identifying factors whether you know it or not. And should you happen to actually purchase something, you are leaving yourself at the mercy of the capitalistic virtual demigods. Not only are you giving away your credit card number, address, zip-code, purchase preference, delivery preference and phone number, but very likely a massive amount of aggregate information stored in cookies from other purchases and visits that you have made. So what’s the difference?

In 1995 I would have totally understood this process. The Internet was a vast wasteland, inhabited by porn and pop-ups, and ruled by unscrupulous characters (no need for student loans, thank you very much).

Even in 2000 the Internet was mostly an unregulated territory where spammers could roam free and ‘Adwords’ was an instant success story (again, thank you). But now? What gives ‘online’ the right to work under a different set of rules and regulations than regular ‘IRL’ commerce?

Opt-out by default should be the standard.

Companies (yes – I am looking at you, Google, TradeDoubler, Yahoo, etc.) collecting personal information should operate on a ‘default is NO’ basis. Not only is this practice borderline illegal in many cases but – and much more importantly – it undermines the very nature of consumer confidence. Thus, it is time to kill ‘Online’ and start treating ‘online’ the same way we do everyday grocery shopping.

Commerce is commerce.

If you buy something ‘online’ or at your local store you should, as a consumer, be able to expect the same service, rights, privacy, and responsibility as you would in any brick-and-mortar store! Anything less and the impact will remain the same – people will keep thinking of the Internet as a less secure, less private and less safe way to buy. And THAT is not good for anyone.

So let’s do away with the excuse that ‘online’ somehow differs from ‘IRL’ and just accept that whether you are face-to-face with your local grocer or 5,000 miles away you are still just buying a gallon of milk.

What people are saying about SpiderOak (Pt. 2)

We’re thankful for you. We said it Friday, and we aren’t quite done.

One of our favorite things is getting to know SpiderOak users. Our loyal customers and fellow privacy fanatics have continually helped us create a better product and develop and grow as a company.

Meet a few more users who were willing to share their SpiderOak story:

Angela, the mother of four:

Gary, the traveler:

“Thank you again for your organization’s fabulous gift of your ‘cloud,’ to me and the friends to which I’ve referred to your company. Your tool allows me to:

  1. Reduce my stress and feel safer because I know that my roughly 25 years of docs are backed-up more securely than on a memory stick or DVD (which I used to use).
  2. Save energy, time and frustration because I no longer need to remember to and, then, manually back up my docs and sync my computers.
  3. Have enhanced convenience and travel about Toronto more lightly without having to drag my laptop because I can access my docs wherever I am in the world via the net.

You won’t be surprised, given the above benefits that I receive from SpiderOak, that I have it loaded into my “Start-up File” on each of my desktops, laptop and wife’s computer so your [client] loads and runs automatically when I start to write. This enhances SpiderOak’s convenience even further for me.”

Christopher, the Linux, Ubuntu and Windows user:

“I am a person who has a certain amount of data I often need to access when I’m away from my workstation. Of course, I’m also concerned that just backing-up to my external drive could have risks. An excellent solution to these needs is the online backup service provided by SpiderOak.

What’s striking about SpiderOak is the elegance and simplicity of the product. A small, easy-to-setup application sits on my Ubuntu and Windows desktops and backs up the data I have specified when it’s created or changed. It’s wonderful that it just gets on with the job without needing my intervention. I can easily make changes to my backup set, check my backed up data, or see how much of my free online storage I have used – which, by the way, is a really attractive feature with 1GB being allocated for each friend referred. I often setup networks and PCs for friends, etc, and I have found that SpiderOak is a great tool for storing useful files in a central location that I can access any time I need to. By using the simple web interface and downloading them I save myself the bother of constantly remembering to have to load a USB stick to carry them around. The same advantages apply when I setup new Linux distros on my own machines.

I’ve also found SpiderOak a great tool to enhance security while travelling. I often travel in Europe on business and previously would carry a laptop or USB containing sensitive data. Yes, it was encrypted, but it gives me much more reassurance that I can now simply access that data securely from my company’s network by logging into my secure SpiderOak account.

Last, but certainly not least, I really like how SpiderOak comes across. The website is simple to navigate and tells me all I need to know. There’s also an element of humour there and in the Twitter feed from @SpiderOak. I really appreciate this. Although I’ve never had a problem with the product, I have the feeling that although SpiderOak people take what they do very seriously, they don’t take themselves too seriously. It’s refreshing to discover a business that is not laden with empty marketing-speak and actually comes across as if it has pleasant human beings working for it who are interested in what I as a customer need. It must be a great place to work! Thank you for an excellent service.”

What about you? What has been your SpiderOak experience? Leave a comment below.

The Risk to Your Encryption Keys when Using Virtual Hosting

Dan Goodin over at Ars Technica has a nice article with an example of one of the privacy risks of using virtual hosting (such as Amazon EC2 and other cloud computing services). This particular scenario allowed attackers to recover GPG keys from other virtual machines that happened to be running on the same physical machine. It’s likely possible to recover SSH keys in a similar way.
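The attack described in the article exploits CPU cache timing on shared hardware, and reproducing it is well beyond a blog post. But the underlying principle – that secret-dependent timing leaks information – can be sketched with a toy example. This is just an illustration of the general idea (the function names are ours, not from any real attack code): a naive byte comparison returns early at the first mismatch, so its running time reveals how many leading bytes an attacker guessed correctly, while Python’s standard-library `hmac.compare_digest` takes the same time regardless of content.

```python
import hmac


def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns as soon as a byte differs, so running time depends on
    # how many leading bytes match -- a timing side channel.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True


def constant_time_equal(a: bytes, b: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where
    # (or whether) the inputs differ, leaking nothing via timing.
    return hmac.compare_digest(a, b)
```

Both functions give the same answers; the difference an attacker on the same physical machine can observe is *how long* the naive one takes to say “no.”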

Since a few customers have asked, SpiderOak owns and operates all of its own physical hardware. None of it is virtual hosting with other organizations.

What people are saying about SpiderOak (Pt. 1)

One of our favorite things is getting to know SpiderOak users. Our loyal customers and fellow privacy fanatics have continually helped us create a better product and develop and grow as a company.

We’re grateful. Allow me to introduce you to a few users who were willing to share their SpiderOak story:

Kevin, the professional musician:

Brook, relieved to have photos and videos safely backed up:

“I just wanted to drop a quick note that I recently started using SpiderOak for my backup and syncing needs, and it works great. I can vouch for it working well on Windows, Linux and the Android app. I love all the flexibility you have with [the client] and really appreciate the ‘zero-knowledge’ data encryption. Over the weekend [my family] organized and backed up about 10GB of photos and videos. It’s a huge relief to have that taken care of. [I really appreciate] SpiderOak’s sync feature for keeping my main documents synced between my desktop and laptop. I’ve contacted support a few times for general questions and always received quick, personal and useful responses. So far, I’m very pleased.”

T3charmy, left Dropbox for SpiderOak:

“I left dropbox for you guys. I had a promo for 5GB free, and so far, you guys are WAY better than Dropbox. My experience… has far exceeded what Dropbox could do. The one thing that I would like to see is the ability to upload files from the Android app. Other than that, you have far exceeded my expectations. 6/5 stars.”

We’ll share more testimonials next week, as well as promote a limited special deal for 25GB we’ve never before offered. Stay tuned!

In the meantime, what about you? What has been your SpiderOak experience? Leave a comment below.