Python is a great programming language, and I use it to write almost all of my tools. But even the best tool cannot protect you from hurting yourself when you don’t know all its edges.
Python has three ways to execute your code: you type up a script and let Python run it (python script.py or ./script.py), you start the interactive Python console (or dev-python/ipython, which is really neat), or you hand Python a command as an argument (python -c 'print 17'). Now, in the interactive case you often have Python files lying around in your current working directory and want to import them to test a function. For this reason the current working directory is the first element of the module search path; in Python terms, sys.path[0] is '' (the current directory). Obviously, this would be a huge mistake when asking Python to run a script file; you would not expect virt-manager to load its dependencies from /tmp just because you started it there. The last option, however, is the corner case here: the Python developers went with the way the interactive shell works, and so this happens:
rbu@peanut /tmp $ echo 'print 1' > re.py
rbu@peanut /tmp $ python -c 'import re'
1
This is not an issue in itself, as it is documented, but it certainly is something you should note when writing scripts that people are supposed to run on multi-user systems. If your shell script in /usr/bin calls “python -c” and people run the script from /tmp, they might end up executing code from Python modules a local attacker had placed there.
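A minimal defensive sketch of the kind of guard such scripts could use (my example, not a fix the advisory prescribes): strip the working-directory entries from the search path before importing anything.

```python
import sys

# Defensive sketch for code that must run via "python -c" from an
# untrusted working directory: remove the cwd entries ('' on current
# Pythons, '.' on some older setups) from sys.path before any import.
def strip_cwd(path):
    return [entry for entry in path if entry not in ("", ".")]

sys.path = strip_cwd(sys.path)
```

With this in place, a planted re.py in /tmp no longer shadows the standard library module.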
And that is how, today, we released GLSA 200810-02 for bug 239560, a local root vulnerability "in" Portage. But in the end, it's not even Portage's fault. Several ebuilds (among them the ebuild for Portage 2.1 itself) used "python -c", and Portage does not change the working directory when it executes the ebuild's bash functions. And judging from the ebuild API specification, it does not have to: the ebuilds are the ones that need to make sure Python does not include the current working directory (e.g. by exporting PYTHONPATH). But even those rules are not written in stone, and I hope we can bring about a change to this contract.
So, if you own or distribute any shell scripts that interact with Python, please make sure you keep your Python in its cage. Oh, and check your usage of urllib2.urlopen() while at it.
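On the urllib2.urlopen() remark: urlopen() also accepts non-HTTP URLs such as file://, so scripts that pass user-controlled strings to it can be coaxed into reading local files. A sketch of a scheme check (written against Python 3's urllib.parse; the whitelist is an example choice, not something the post specifies):

```python
from urllib.parse import urlparse

# Reject URLs whose scheme is not explicitly allowed before handing them
# to urlopen(); without such a check, a user-supplied "file:///etc/passwd"
# would be opened like any web resource.
def is_safe_url(url, allowed_schemes=("http", "https")):
    return urlparse(url).scheme.lower() in allowed_schemes
```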
According to the author the $35 Windows program Folder Lock is “a fast file-security program that can password-protect, lock, hide and encrypt any number of files… Protected files are hidden, undeletable, inaccessible and highly secure”. It even works on “USB Flash Drives, Memory Sticks, CD-RW, floppies and notebooks.”
Now while I still wonder how they protect files from deletion on USB sticks, Charalambous Glafkos found out that the password to encrypt the files is stored in the Windows Registry. For maximum security it is reversed and encrypted with ROT-25.
OpenSSH 5.1 is out, and besides a security issue that does not affect Linux or the BSDs, it includes a new feature labelled VisualHostKey, aka SSH Fingerprint ASCII Visualisation. Using an idea proposed in the 1999 paper Hash visualization: A new technique to improve real-world security by Perrig and Song, an image with 17×9 resolution is generated from the fingerprint of the SSH server and displayed to the client.
Since the feature is experimental, and the algorithm to generate the image should not be considered final yet, display is disabled by default. You can see a test-run in the screen capture, and a (just for fun) list of images of my known hosts. I wonder how long it takes to remember that face… doesn't it look a bit like Marge Simpson?
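To give a feel for the idea, here is a toy hash visualizer in the same spirit (explicitly not OpenSSH's actual algorithm, just a sketch): a walker starts in the middle of a 17×9 grid, each pair of digest bits moves it diagonally, and visited cells climb through a character ramp.

```python
import hashlib

# Toy fingerprint visualizer (NOT OpenSSH's algorithm): deterministic,
# so the same key always yields the same picture, and different keys
# yield visibly different ones.
def visualize(data, width=17, height=9):
    chars = " .o+=*BOX@%&#"
    grid = [[" "] * width for _ in range(height)]
    x, y = width // 2, height // 2
    moves = {0: (-1, -1), 1: (1, -1), 2: (-1, 1), 3: (1, 1)}
    for byte in hashlib.md5(data).digest():
        for shift in (0, 2, 4, 6):          # consume two bits at a time
            dx, dy = moves[(byte >> shift) & 3]
            x = min(max(x + dx, 0), width - 1)   # clamp at the borders
            y = min(max(y + dy, 0), height - 1)
            step = min(chars.index(grid[y][x]) + 1, len(chars) - 1)
            grid[y][x] = chars[step]
    return "\n".join("".join(row) for row in grid)
```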
Now why all this, you are asking?
It is deemed that images are easier to compare and remember than the usual 32 hex digits, and I believe everyone has to judge for themselves whether that is true. How many of those SSH/OTR/SSL… fingerprint digits do you check*? All of them? Any, at all? Where did you derive your latest Firefox SSL CA certificates from? At a time when I cannot trust my provider to run a secure DNS server, verifying the authenticity of either the other end of the communication or the data in transit is most crucial. Let's finally get that Tree Signing going!
* If you only check the first 4 digits and the last 2 — you are riding on a 24-bit fingerprint.
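Spelling out the footnote's arithmetic:

```python
# Each hex digit of a fingerprint carries 4 bits, so checking the first
# four and the last two digits verifies (4 + 2) * 4 = 24 bits of a
# 128-bit MD5 fingerprint.
checked_bits = (4 + 2) * 4
# An attacker only needs a key that matches in those positions, i.e.
# one collision in a space of 2**24 values.
search_space = 2 ** checked_bits  # 16777216
```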
If you run any kind of server, especially Debian or Ubuntu, or grant users access to your server, you might want to read the Debian Security Advisory DSA-1571-1 or Ubuntu’s Security Notice USN-612-1 for CVE-2008-0166, and check your encryption keys:
It is strongly recommended that all cryptographic key material which has
been generated by OpenSSL versions starting with 0.9.8c-1 on Debian
systems is recreated from scratch. Furthermore, all DSA keys ever used
on affected Debian systems for signing or authentication purposes should
be considered compromised; the Digital Signature Algorithm relies on a
secret random value used during signature generation.
The first vulnerable version, 0.9.8c-1, was uploaded to the unstable
distribution on 2006-09-17, and has since propagated to the testing and
current stable (etch) distributions. The old stable distribution
(sarge) is not affected.
Affected keys include SSH keys, OpenVPN keys, DNSSEC keys, and key
material for use in X.509 certificates and session keys used in SSL/TLS
connections. Keys generated with GnuPG or GNUTLS are not affected.
This vulnerability is caused by a patch shipped in Debian, Ubuntu, and other derivatives. Gentoo’s OpenSSL version is not affected, but everyone should check user-provided public keys (such as OpenSSH’s authorized_keys) using the Debian/OpenSSL Weak Key Detector.
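The detector can work by lookup: because the broken PRNG was effectively seeded only by the process ID (1..32767), every possible weak key can be pre-generated and its fingerprint put on a blacklist. A sketch of that check, with made-up fingerprints standing in for the real lists:

```python
# Sketch of a blacklist check as performed by the weak-key detectors.
# The fingerprint entries below are fabricated for illustration only;
# the real tools ship full per-type, per-size lists.
WEAK_FINGERPRINTS = {
    "aa" * 16,  # hypothetical weak-key fingerprint
    "bb" * 16,  # hypothetical weak-key fingerprint
}

def is_weak(fingerprint):
    # normalize "aa:bb:..." colon notation to a bare lowercase hex string
    return fingerprint.lower().replace(":", "") in WEAK_FINGERPRINTS
```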
Update: Ben Laurie of OpenSSL is making a point that Vendors Are Bad For Security, a claim I would not follow in that general form. What I will grant him: mechanisms of peer review must be employed properly, and patches must be discussed with upstream. If you follow that philosophy, Vendors Are Good For Security.