Helping people with computers... one answer at a time.
It's not uncommon for folks to ask why computer systems seem as fragile and as vulnerable as they so often do. It's a legitimate question.
My question is perhaps more an industry question than a personal computing one. Because malware, viruses, spam, and similar user-beware problems affect just about everyone who uses the 'net for their daily informational needs, why hasn't the technology industry tackled these issues head on? These are the problems that ultimately hit the non-computer-savvy general user hardest.
Perhaps the question can be simplified: on the foreseeable horizon, will there be a time when users will not have to worry about viruses and malware? And why can't computer developers simply make a virus-free system now?
Are there existing machines, platforms, and so on that can affordably take the risk out of using the internet? It just seems that no matter how careful you are or what anti-virus software you use, the "bug" eventually gets you and huge problems ensue. You would think the profit potential would be so significant that developers would be jumping all over this opportunity: the bug-free system.
You're actually asking two separate questions:
Is it possible to create or write bug-free software?
Is it possible to create a computer system that is impervious to malware?
The practical answer to both is, unfortunately, no.
It sounds really simple: if we just wrote software more carefully, used better tools or techniques, or hired better programmers, we should be able to get rid of every possible bug, right? No mistakes. Ever.
There is no such thing as bug-free software. Period.
Yes, some software is better than others, but measured absolutely, no software ever reaches perfection.
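A famous illustration: the midpoint calculation in textbook binary search hid an integer-overflow bug for roughly two decades before Joshua Bloch documented it in 2006. Python's integers never overflow, so this sketch simulates 32-bit signed arithmetic (the sample index values are made up for illustration) to show how the "obviously correct" formula fails:

```python
# Sketch: the classic binary-search midpoint bug, simulated with
# 32-bit signed arithmetic (Python's own integers never overflow).

def to_int32(x):
    """Wrap a Python int the way a 32-bit signed C/Java int would."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

# Two perfectly valid array indices in a very large array:
low, high = 1_100_000_000, 2_000_000_000

# The textbook formula, as it shipped in C and Java for years:
buggy_mid = to_int32(low + high) // 2   # low + high exceeds 2**31 - 1

# The published fix: compute the offset first, so the sum never overflows.
fixed_mid = low + (high - low) // 2

print(buggy_mid)   # negative: an out-of-bounds "index"
print(fixed_mid)   # 1550000000: the correct midpoint
```

If code that simple, reviewed that often, can be wrong for that long, "no mistakes, ever" in a system of millions of lines is not a realistic goal.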
There are three problems at play here: complexity, time, and functionality.
What most people fail to grasp is the incredible complexity behind most of our computer systems today. It's truly mind-boggling to think of the thousands, if not hundreds of thousands, of man-years of effort that have gone into getting your computer to boot and run effectively. (I'm being OS-agnostic here. I don't care if it's Windows, Mac or Linux - they're all incredibly complex beasts.)
People who understand this complexity are amazed that these systems work at all. I know I am.
Make it less complex? Well, that means making it do less, be capable of less, and be less functional.
Whatever you decide to cut out is important to someone. I don't care which feature you hate the most and would love to see cut completely from the next version of whatever product you care to name. There's someone, perhaps lots of someones, who care deeply about that feature and would be incredibly upset to see it removed.
Computers are general purpose devices and people expect computers to be capable of many things - even many things that haven't been thought of yet.
And that leads to incredible complexity.
So why not just take more time to get it right?
There's a strong argument for that, and you'll often see difficult decisions being made throughout the life of a software project, jettisoning features and functionality so that more time can be spent on getting what remains correct. Or you'll see projects take longer than planned because of the extra time required to meet a minimal quality bar.
But the practical reality is that software that never ships does no one any good. At some point, a trade-off has to be made between spending more time developing software or deciding that it's good enough, knowing that it will never, ever be perfect.
It's not that the people working on these projects are stupid - far, far from it. Writing today's intensely complex systems in a way that meets everyone's expectations in a reasonable amount of time is hard. Very hard.
It's not an excuse, it's a reality. And the reality is that mistakes will be made.
As I said, computers are general purpose devices. We use them to do an amazing variety of things simply by loading different software. When you think about it, it's pretty magical.
So, tell me this: what is malware?
Seriously, how do you define a strict set of rules that defines what software can do that is "good", and what it should never, ever do because it's "bad"?
Sure, some things are obvious, but that's not the point. The point is the grey areas.
Just about any activity that you can think of as being malicious can also be viewed, from a different perspective, as being potentially useful. Consider for a moment Data Execution Prevention (DEP), which blocks programs from running data as if it were code. Being able to execute data as a program can be a useful programming technique; look at how many programs broke when DEP was first enforced. And yet it's enforced, because executing data is a common vector for malware.
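DEP itself polices native machine code, but the underlying blur between data and code shows up at every level. As a rough, interpreted-level analogy (not how DEP works internally), here's Python treating the same string first as inert data, then as a running program - the `payload` string is a made-up example standing in for input from a file or the network:

```python
# A tiny analogy for the data-vs-code blur that DEP polices at the
# machine-code level: the same bytes are harmless as data, live as code.

payload = "result = sum(range(10))"   # hypothetical input from a file or network

# Treated as data: just a string to store, display, or measure.
print(len(payload))

# Treated as code: Python's built-in exec() runs the string as a program.
# Legitimate tools (plugin loaders, notebooks, templating engines) rely on
# exactly this; so does injected malware. Same mechanism, different intent.
namespace = {}
exec(payload, namespace)
print(namespace["result"])   # 45
```

That's the grey area in a nutshell: you can't write a rule that forbids "running data as code" without also breaking software people legitimately depend on.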
I'm certainly not saying that there aren't ways to make things better than they are. I'm sure that there are additional improvements or rules that, along the lines of DEP, might break things for a while, but would ultimately result in a more secure environment.
What I am saying is that short of turning your computer into a device which cannot be programmed at all, there is no way to prevent malicious software in any absolute sense.
As long as there are bugs (and there always will be)...
As long as there are folks with malicious intent (probably also always will be)...
As long as we can be fooled into running software with malicious intent...
As long as we can't limit what computers might be legitimately expected to do...
Malware will be with us.
And for the record: I'd love to be wrong. Truly.
I just don't see it happening. At least, not in my lifetime.