[vox-tech] My wife's website

Rick Moen rick at linuxmafia.com
Fri Jan 12 09:25:43 PST 2018


Quoting Alex Mandel (tech_dev at wildintellect.com):

> I outsource to Wordpress.com, just pay the $15 a year to use a custom
> domain. I figure if the main vendor behind the software can't keep it
> patched and safe, no one can.
  ^^^^^^^^^^^^^^^^

Quoting from Marcus Ranum's[1] 'The Six Dumbest Ideas in Computer Security',  
http://www.ranum.com/security/computer_security/editorials/dumb/ :

  #3) Penetrate and Patch

   There's an old saying, "You cannot make a silk purse out of a sow's
   ear." It's pretty much true, unless you wind up using so much silk to
   patch the sow's ear that eventually the sow's ear is completely replaced
   with silk. Unfortunately, when buggy software is fixed it is almost
   always fixed through the addition of new code, rather than the removal
   of old bits of sow's ear.

   "Penetrate and Patch" is a dumb idea best expressed in the BASIC
   programming language:

   10 GOSUB LOOK_FOR_HOLES
   20 IF HOLE_FOUND = FALSE THEN GOTO 50
   30 GOSUB FIX_HOLE
   40 GOTO 10
   50 GOSUB CONGRATULATE_SELF
   60 GOSUB GET_HACKED_EVENTUALLY_ANYWAY
   70 GOTO 10  

   In other words, you attack your firewall/software/website/whatever from
   the outside, identify a flaw in it, fix the flaw, and then go back to
   looking. One of my programmer buddies refers to this process as "turd
   polishing" because, as he says, it doesn't make your code any less
   smelly in the long run but management might enjoy its improved, shiny,
   appearance in the short term. In other words, the problem with
   "Penetrate and Patch" is not that it makes your
   code/implementation/system better by design, rather it merely makes it
   toughened by trial and error. Richard Feynman's "Personal Observations
   on the Reliability of the Space Shuttle" used to be required reading for
   the software engineers that I hired. It contains some profound thoughts
   on expectation of reliability and how it is achieved in complex systems.
   In a nutshell its meaning to programmers is: "Unless your system was
   supposed to be hackable then it shouldn't be hackable."

   "Penetrate and Patch" crops up all over the place, and is the primary
   dumb idea behind the current fad (which has been going on for about 10
   years) of vulnerability disclosure and patch updates. The premise of the
   "vulnerability researchers" is that they are helping the community by
   finding holes in software and getting them fixed before the hackers find
   them and exploit them. The premise of the vendors is that they are doing
   the right thing by pushing out patches to fix the bugs before the
   hackers and worm-writers can act upon them. Both parties, in this
   scenario, are being dumb because if the vendors were writing code that
   had been designed to be secure and reliable then vulnerability discovery
   would be a tedious and unrewarding game, indeed!
   [...]


Your WordPress has never been _safe_ merely because it got patched.
And patched.  And patched.  And patched.  As Ranum points out, if the
past security work had been sufficient, it wouldn't have been necessary
to keep fixing the same code modules' security breakdowns in the same
places over and over -- the sure mark of fundamentally bad code that
never actually gets fixed.

IMO, the only appropriate remedy for fundamentally bad code (like
public-facing PHP itself, not to mention WordPress) is to cease
using it.

A few years ago, after repeated PHP security problems, I banished all
public-facing PHP from my linuxmafia.com site by converting every page
that relied on it to one of several means of serving static HTML
instead.  This turned out to be a good thing from several perspectives,
including the discovery that several site features never needed to be
assembled dynamically by a PHP interpreter at page-load time in the
first place; they had been implemented that way solely because of coder
laziness (including my own).
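
For the record, the mechanical part of banishing mod_php on a
Debian-style Apache host is nearly a one-liner.  A hypothetical sketch
(run it only after the .php pages have been converted or removed, lest
Apache start handing out their source text as-is):

  # Unhook the PHP interpreter module, then restart Apache:
  a2dismod php5
  service apache2 restart

  # Sanity check: a former PHP URL should now 404 (or serve a plain
  # file) rather than execute anything:
  curl -I http://www.example.com/old-page.php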


> The other route to go, is to switch to a static site generator
> https://www.fullstackpython.com/static-site-generator.html
> Many of which are blog oriented.

Word.

I certainly include myself in the 'coder laziness' category: e.g., my
personal FAQ pages had been designed by yr. humble servant as a series
of PHP include directives (for the header, footer, table of contents,
etc.) for no better reason than that it was easier than thinking.  Five
minutes' pondering yielded the obvious alternative of building the pages
using GNU make.

http://linuxmafia.com/~rick/faq/
http://linuxmafia.com/~rick/faq/Makefile
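
The linked Makefile is the real thing; a minimal sketch of the same
idea looks like this (file names invented for illustration; recipe
lines must begin with a tab):

  PAGES = index.html kicking.html
  PARTS = header.inc toc.inc footer.inc

  all: $(PAGES)

  # Each page is header + table of contents + its own body + footer;
  # make rebuilds a page only when one of its ingredients changes.
  %.html: %.body $(PARTS)
  	cat header.inc toc.inc $< footer.inc > $@

  clean:
  	rm -f $(PAGES)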

Other pages such as BALE (http://linuxmafia.com/bale/) turned out to be
easy to generate by running the PHP interpreter (/usr/bin/php5) locally
from periodic cron jobs that emit static HTML pages, i.e., the pages
never needed to be dynamic, just periodically regenerated.
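
The moving part is just a crontab entry; a hypothetical sketch (paths
invented for illustration -- note that a crontab entry is one line):

  # Rebuild BALE's static page every 10 minutes; visitors only ever
  # see the finished HTML, never a live interpreter:
  */10 * * * *  /usr/bin/php5 /var/www/bale/bale.php > /var/www/bale/index.html.new && mv /var/www/bale/index.html.new /var/www/bale/index.html

The write-to-a-temp-file-then-mv dance is there so that a visitor who
hits the page mid-rebuild still gets the complete previous version,
since mv within a filesystem replaces the file atomically.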

So, with a modest bit of rethinking and revisiting of implementation
approaches, I got better security, better performance, and better
reliability.  Pretty good deal.


[1] Noted BSD security expert, architect of the original TIS Firewall
Toolkit, author of one of the first high-security ftp daemons, etc.  The
quoted excerpt might seem like Ranum at his most sarcastic, but that's
nothing compared to when he revisited the same theme, here:
http://www.ranum.com/security/computer_security/editorials/master-tzu/


