There sure is a lot of crap on the internet, and Oracle has its fair share. So how do you assess the quality of a piece of advice or a technical article? Here are some of my own thoughts.
- Good Sign: Lots of “ifs, buts, and caveats”. Very few features of a technology are straightforward good news, and there’s nearly always a downside to any optional feature or technique — otherwise it wouldn’t be optional, I suppose. For example, an article that states outright “index-organized tables give better performance” is leading you astray, because while key-based retrieval may be faster, there is an overhead on data modification that you ought to be aware of. If an article doesn’t tell you about the bad news, then look for one that does.
- Good Sign: an explanation that is rooted in documented behaviour. There may be features and techniques in Oracle on which you can find no documentation, either in the official documentation or in Metalink, but in my experience they are few and far between. The Oracle documentation is excellent, and the best articles are those that supplement and expand on information contained therein.
- Good Sign: the ability to question the author. Can you send an email, or a forum PM, to the author asking for clarification? If so, what sort of response do you get?
- Good Sign: edits. If the article contains something like “Shortly after publication I received an email from … pointing out that … and in my opinion this is/is not a valid point”. An author who is willing to acknowledge a mistake and credit the person who corrected them, or who is willing to acknowledge that their information was challenged, is probably writing better articles.
- Bad Sign: $$$’s required. If you get a brief introduction or rave reviews of a technique, and are then referred to a book or service you have to pay for in order to get more (useful) information, then I’d be steering clear. You absolutely would not believe how much nonsense has been published about Oracle in the last decade, nor how much of it is just a re-phrasing of documentation.
- Good Sign: practical demonstrations. There are some features that are difficult or impractical to demonstrate, but you ought to be able to recognise those situations. Speaking as a guy who develops and tests on a small system for later test and deployment on much larger hardware (on a completely different operating system as well), it is my opinion that if you are careful then lessons learned in development can be successfully integrated into the larger-scale system. Even with only a single CPU at my disposal, I can infer a great many useful lessons about how parallel query works on an eight-CPU system.
- Ambivalent: the author. There are obviously some authors who I am going to trust more than others, but I try to be equally cynical about everyone. I’ve made enough mistakes in my professional life, and seen enough made by others, not to expect perfection from anyone. This leads us back to my fourth point, of course … if I have never seen an author admit to a mistake and acknowledge it publicly, then that’s a big red flag. So maybe it would be fairer to say that I am cynical about everyone, but more cynical about some people in particular.
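To make the index-organized table example above concrete, here is a minimal sketch of the same table defined both ways (the table and column names are invented for illustration). The point is simply that the tradeoff is visible right in the DDL: the IOT stores the rows in the primary-key B-tree itself, which is what speeds up key-based lookups and what adds maintenance cost on inserts and updates.

```sql
-- Hypothetical example: the same table as a conventional heap table
-- and as an index-organized table (IOT).

-- Heap table: rows stored in no particular order; the primary key
-- is a separate index structure pointing back at the rows.
CREATE TABLE orders_heap (
  order_id    NUMBER PRIMARY KEY,
  customer_id NUMBER,
  order_date  DATE
);

-- IOT: the rows ARE the B-tree, ordered by primary key. Lookups by
-- order_id avoid the index-to-table hop, but every insert or update
-- must maintain the B-tree structure of the table itself -- the
-- data-modification overhead a fair article ought to mention.
CREATE TABLE orders_iot (
  order_id    NUMBER,
  customer_id NUMBER,
  order_date  DATE,
  CONSTRAINT orders_iot_pk PRIMARY KEY (order_id)
) ORGANIZATION INDEX;
```

Which one wins depends on the workload — lots of key-based reads favour the IOT; heavy, unordered DML favours the heap — which is exactly the kind of “if, but, and caveat” a good article will spell out.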
I expect that I’ll think of more, so I’ll repost if anything comes up … or if I get other suggestions (full credit promised!)