When I hacked the SecuStick, one of the manufacturer's rebuttals was: 'This kind of hack is too difficult: 95% of the people can't crack it!'. There are two problems with that kind of reasoning. The first problem is that 5% is enough: it means that if you know 19 people and have stolen or found such a 'secure' stick, you can get the one guy who does have the knowledge to crack it for you. This way, the contents of such a stick are available to many more people than just 5% of the population.
The second problem with these kinds of hacks is that to get access to the information on such a 'secured' stick, one has to modify the software that runs on the PC. As we all know, software is copyable, and the same goes for hacked versions of this software, which e.g. accept any fingerprint as correct. If someone were to create such a software package and distribute it over the Internet, every person with a spot of Google-knowledge could get to the info on the stick. That would mean the 95% who aren't technical enough to do the hack itself would shrink to the number of people who can't use Google. That number is a lot smaller.
One final note: the SecuStick review generated some comments like 'Why is that routine called VerifyPassword? That's waaay too obvious!'. True, the code could have been obfuscated by its creators. It wouldn't be much of a deterrent, though: the binaries that check the passwords or fingerprints I hacked are actually quite unprotected and transparent. They look a bit like the earliest copy protection schemes on software and games: easily hackable if you have access to a debugger. Nowadays, software and games have heavily obfuscated copy protection: multiple checks, encrypted executables, anti-debugger measures, you name it. Did it stop cracked games and programs from coming out? Not at all. The same goes here: obfuscating the code can delay the crack a bit, but the software will be cracked eventually.