Once I had got to the bottom of a problem I was having recently with the CryptoAPI, I realised that I may have become accustomed to trusting default values a little too much.
One of the reasons I prefer using the .NET Framework to Win32 is that .NET APIs tend not to require me to
pass in great long sets of parameters - they usually provide simple overloads that supply sensible defaults. So when I
have to return to Win32, I am now too ready to pass in NULL
whenever I'm allowed to. This was how I
came to make the mistake of allowing the CryptoAPI to choose a default key length for me, as described in the entry
linked to above. (The key length/effective key length schism is an entirely different matter - that's just an unhelpful
API quirk; I'm glad it has been fixed in XP/2003 even though this change was what caused the problem to manifest
itself.)
But this got me thinking about the wisdom of relying on default values. If it's going to get me into trouble like this, perhaps I shouldn't be so ready to do it.
But why does the Win32 Crypto API offer to supply defaults if the resulting behaviour is going to be inconsistent on different platforms, or even in different environments on the same platform? By allowing the CryptoAPI to pick default values, I am saying 'Use whatever the local settings are; I am aware that what I encrypt with these settings might fail to decrypt in a different environment'. Once you understand that this is the implication of using the defaults, the behaviour you see makes sense. But this is not necessarily what developers will expect - I tend to assume that if I can ask for a default to be supplied for me, the default will be benign. In this case, it wasn't, so I got a nasty surprise.
If defaults might not make sense, then I don't think that they should be defaults. If the system cannot determine
non-surprising defaults for me in a safe way, then I should be required to set such parameters explicitly. In my
CryptoAPI example, it would have been better if the key generation APIs simply failed when given a key length of
zero, rather than picking a context-sensitive default. If you want to use a potentially non-portable environmental setting,
that should require an explicit step, rather than being a default. For example, rather than passing in NULL
to get the default crypto service provider, it would be better for there to be a special value you can pass in to indicate
that you want whatever the local environment's default provider is - that way you are opting into
environmentally-sensitive behaviour.
I realise that there's a tension here. Simplicity is a desirable property of an API, and one of the ways to make an
API simple is to have it provide intelligent defaults. But I think it's important for defaults to be non-surprising. This is
especially important for .NET APIs, where you might not even see any evidence that some default value or behaviour
has been selected on your behalf. With Win32 APIs, passing in a value of zero or NULL
acts as a cue
that something might be decided on your behalf. (Although I came unstuck with the key length thing because the key
length is squished into the top 16 bits of a flags parameter, rather than getting its own parameter...) With .NET, you don't
get this cue. An API with optional defaultable inputs tends to offer a range of overloaded constructors and
properties allowing values to be set in scenarios where you don't want the defaults. This means that it's not remotely
obvious from the code that the value for which a default is being supplied even exists. Consider this example:
StringFormat sf = new StringFormat();
How many parameters just got set to default values for me there? It's really hard to tell. In fact one of the APIs
that accepts a StringFormat object is Graphics.DrawString, and several of its
overloads don't even take a StringFormat, so it wouldn't necessarily occur to a
developer to ask the question in the first place. Drawing text is a surprisingly complex endeavour, yet the APIs can
make it look simple by providing sensible defaults for the myriad tweakable settings. In this case, this is A Good Thing,
because the defaults chosen are in fact what you want most of the time. Just as well really, since if you're new to
GDI+, you might not realise just how much is being done on your behalf when you see this:
protected override void OnPaint(PaintEventArgs e)
{
    e.Graphics.DrawString("Hello, world", Font, Brushes.Black, 0, 0);
    base.OnPaint(e);
}
So in .NET APIs, I think the principle of least astonishment is even more important than it is in C-style APIs like Win32. As long as APIs get this right, the code that uses those APIs can remain clean and uncluttered. (And of course, our code can provide explicit values when non-default behaviour is required.) But if the defaults have the capacity to surprise, then we're in a world of pain.