Sketch: An idea I have been kicking around


I have been kicking around an idea lately and thought I should capture it in a few sketches.  I want to use the sketches to get feedback from people before I start committing to actual work.  I don't spend enough time sketching my ideas out, and I would like to fix that.

I used the Expression Blend SketchFlow feature to capture this in about 5 minutes.  I have been reading about the full (proper) use of SketchFlow in the book Dynamic Prototyping with SketchFlow in Expression Blend by Chris Bernard and Sara Summers.  I used to work with Sara and Chris when I was an Evangelist for Microsoft DPE.

Website update for Safari Reader

One of the things I have been doing while updating my blog template is making sure that it works and looks good in the popular browsers: Internet Explorer 8, Firefox 3.6 and Safari 5 (yes, I need to test in Chrome as well).  In doing so I finally got to go hands-on with a new feature of Safari called Reader, which Apple shipped in June with Safari 5.  I had heard good things about Safari Reader from Dave Winer, and heard it was the end of the world from Jim Lynch (who writes about advertisements and other things).

What is Safari Reader?

Here is the description of Reader from the Apple site:

With Safari Reader, you no longer have to deal with annoying ads and other visual distractions that get in the way of your online articles. That’s because Safari detects if you’re on a webpage with an article. Click the Reader button in the Smart Address Field, and the article appears instantly in one continuous, clutter-free view.

The best way to see what Reader does is to look at a website with Reader in use.  My new layout does not have a lot of flair to it, so I figured I would show you a couple of screen shots from the blog of my buddy Dave Bost.  Here is one of his recent articles in Safari without the Reader view enabled (note the "Reader" button in the address bar, letting you know you can click it):

Like many blogs, Dave has a header, some navigation, some advertisements, RSS icons and even a Twitter Badge.  Now click on the Reader view and this is what you see:

It is the same article, but all the non-article content is suppressed.  It is hard to tell from the screen shot, but the text is also slightly bigger than the native size on the page, aiding readability just a bit.  As I understand it, if the article is multiple pages long Reader will pull the content from all the pages (I don't use Safari as my day-to-day browser, so I have not seen that in action).  One last thing to notice is that the Reader icon in the address bar is now purple, letting you know you are in Reader view.

Have you taken a look at your website(s) in Safari Reader view?

Once I figured out what Reader did I was anxious to see what my site looked like with it turned on.  I figured that there would not be much difference, because my new layout is pretty minimalist (I am proud of that – can you tell?).  So I fired it up and here is what I saw:

As I figured, there was not too much of a change, but then I looked carefully and noticed something missing.  Did you catch it?  Here is a hint:

Reader had dropped my title, dismissing it as flair!  It only took me about a minute of digging to find the problem, and it was in my own markup.  This is how the blogging engine and template were rendering the title of the page (I did shorten the href for display):

<div class="itemTitle">
<a class="TitleLinkStyle" rel="bookmark" href="/MeetTheNewBlog.aspx">Meet the new blog…</a>

</div>

I had not marked the title of the post with an <h2> element, even though I had used an <h1> element for the title of the site and <h3> elements in the body of the article.  Reader uses heuristics to "guess" at your markup, and heuristics can be quite good but never perfect, given the variability in markup across the Internet (and the Internet is pretty big, from what I have heard).  So I updated the template and the corresponding CSS for layout to add an <h2> element like so:

<div class="itemTitle">
<h2><a class="TitleLinkStyle" rel="bookmark" href="/MeetTheNewBlog.aspx">Meet the new blog…</a></h2>

</div>
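
For completeness, the corresponding CSS change was along these lines.  This is just a sketch: the .itemTitle selector comes from the markup above, but the property values here are illustrative rather than my exact stylesheet.

.itemTitle h2
{
    /* keep the new heading from changing the look of the title link (illustrative values) */
    display: inline;
    margin: 0;
    font-size: 1em;
    font-weight: normal;
}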

Here is what the page looks like with the updated mark-up.  Much better, but you will notice that Reader has now dropped the title of the blog; I am okay with that:


Checking how my site looked in Reader reminded me of an important lesson about the proper use of HTML heading elements.  <h1>, <h2> and all their friends are more than just styling hooks; they give semantic information about your page.  It is not enough that an element looks like a heading (you can do that with any old <div> or <span> element); it should also match how it is used on the page.  It was just plain silly for me to have <h1> and <h3> elements and not have a corresponding <h2>.
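
To make the lesson concrete, here is a minimal sketch of the heading hierarchy I was aiming for.  The site title text is a placeholder and the structure is simplified from the real template:

<h1>My Blog Title</h1>  <!-- the title of the site -->
<div class="itemTitle">
    <h2><a rel="bookmark" href="/MeetTheNewBlog.aspx">Meet the new blog…</a></h2>  <!-- the title of the post -->
</div>
<h3>Still in Beta</h3>  <!-- sub-headings within the body of the article -->
<p>Article text goes here…</p>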

Hope this helps….

Meet the new blog…

I started quietly working on an update to this blog last week and I hope you like the changes.  If you are reading this in your feed reader, take a moment to go check it out on the site at http://eraserandcrowbar.com.

Still in Beta

I am not 100% done with the changes that I wanted to make, but the bulk of the plumbing is done.  I really felt that it was a good time to put out the updates and ask for feedback on what I had done so far.  I expect to make minor tweaks and I have no issues doing that in public.

Does it look familiar?

You might have stumbled on a project called "the setup", a site that interviews a bunch of technologists about the equipment they use to get the job done.  My new site template was inspired by that site (and a bit of the CSS is taken directly from it).  The creator of that site shared it under a Creative Commons license, and the attribution is on every page of my template.  Lots of love for sharing your work!

Special thanks goes to Matt Gauger, who turned me on to the setup and has created his own version that profiles Wisconsin technologists.  Matt, my answers are done, I just can't find a picture that I like!

Other changes

In addition to the obvious visual changes I made a number of “behind the scenes” improvements, including:

  • Fixed links that were pointing to http://larryclarkin.com so that they 301 redirect to http://eraserandcrowbar.com.  So pretty much any link on the internet will now find the original post.
  • Fixed internal links to point to a common location and do so in a relative fashion (there is a small example after this list).  This will make back-up and changing hosting providers much easier (not moving, but you never know!)
  • Dropped tables from most of the site; there are still a couple in the formatting of the comments section from the dasBlog software, and I am debating what to do with those.
  • Cleaned up a number of validation errors that were caused by a combination of the blogging software, original template and my terrible markup skills.
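
Here is the small example of the relative-link change I mentioned above; the URLs are illustrative rather than the exact links from the old template:

<!-- before: an absolute link tied to the old domain -->
<a href="http://larryclarkin.com/MeetTheNewBlog.aspx">Meet the new blog…</a>

<!-- after: a root-relative link that keeps working if the domain or hosting provider changes -->
<a href="/MeetTheNewBlog.aspx">Meet the new blog…</a>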

I will be following up with a couple of blog posts on some of the things that I discovered while making the updates to this site, because I think you might find them useful.  Just for reference here is a screen shot of the previous look and feel:


We need our stinkin’ badges

"Badges? We ain't got no badges. We don't need no badges! I don't have to show you any stinkin' badges!"
– "Gold Hat" (the Mexican bandit leader) in the 1948 film The Treasure of the Sierra Madre

I recently signed up for Foursquare, which is a “location based social network service” that has been around for a little over a year.  I joined it for a couple of reasons: 

First reason: I have always been fascinated by mapping.  When I was younger I used to spend hours at a time just looking at the maps that we had gotten from National Geographic magazines.  We also had decorative maps from the Caribbean hung around our house (I actually lived in Puerto Rico as a child).  

Second reason: one of my summer projects (which I will share on the blog as it progresses) involves doing some coding with location based data.  I saw tracking my location via Foursquare as a great way to build up a nice set of location based data without having to do anything I was not already doing.  In addition to having the data on their website and accessible to applications on your mobile phone, Foursquare also makes your "Check in" history available to you in a handful of simple formats.

In addition to these simple formats, they provide a robust REST based API that outputs XML and JSON.  This allows developers to build cool data visualizations and interesting queries against the data.  It also supports a standard authentication protocol, so you can control which application developers have access to your data (more on OAuth some other time).

What keeps me coming back

So that is why I joined Foursquare, but what has really got me hooked on this service is their implementation of a reward based system of badges.  When you accomplish certain tasks or reach certain plateaus of activity you are rewarded with a badge on your account.  Here are a couple of the ones that I have been awarded (see the Complete List):


Gym Rat – Checking in at a gym more than 10 times within a 30 day period.  As a note, only a technology company would consider 10 visits a “Gym Rat”.


Swarm – Checking in at a venue with 50+ other people.  In my case I was at a baseball game with 43,000 other people.

I love the natural competition of earning badges and being able to proudly display them (although I don’t think anyone looks at my badges).  When you combine the badges with the foursquare system of naming a “mayor” to each venue it becomes almost a game and can cause you to do unnatural things.  The other day my wife and I were going to dinner and I picked a local restaurant just because I was close to becoming the mayor of it.  I am currently on a quest to become the mayor of 10 venues at once, because there is a badge for that!

Not a new idea

Frequent visitor cards, where you get a free sandwich for buying 9 sandwiches, have been around forever.  And companies have had employees of the month/year just as long.  It is neat seeing the concepts applied to computer systems.  A couple other examples:

  • The Xbox 360 launched a reward based system of badges (called achievements) and points that have game players playing games longer
  • Stackoverflow has a reward based system to incent the answering of programming questions

Enterprise Achievements?

I talk with customers frequently about how to spur adoption of web based concepts in their companies.  We all know the value of blogs, wikis, group chat, social tagging and other concepts.  But often when companies try to apply these technologies the adoption is slow.  We, as an industry, know from experience that it is just a matter of getting enough interest going to reach a tipping point where the idea takes off on its own.  Maybe we could apply a badge based system of rewards to spur early interest in adopting the technologies.  Here is a mockup of a badge that I built with the help of the achievement generator over at http://achievements.schrankmonster.de/:


Common Screen Resolutions

I am currently working on a couple of web-based projects that involve adapting the content that is displayed to the dizzying array of screen resolutions that are available on today’s desktops, laptops and mobile phones.  I started making a list of the screen resolutions and thought I would share it:

Mobile phone resolutions (these commonly rotate between portrait and landscape):

  • 320×480 (iPhone 3)
  • 480×800 (Nexus One/HD2)
  • 640×960 (iPhone 4)

Full Screen resolutions:

  • 640×480 (old school)
  • 800×600
  • 1024×768 (many projectors)
  • 1280×1024
  • 2048×1536

Widescreen resolutions:

  • 1280×800
  • 1440×900
  • 1600×900
  • 1680×1050
  • 1920×1080
  • 2048×1152
  • 2560×1600

This list is by no means exhaustive, but I hope it captures enough of a baseline that I can use to “unit test” the projects that I am working on.  Please let me know if I have missed any obvious ones!
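
For the adapting itself, the approach I am experimenting with is the viewport meta tag plus CSS media queries.  This is only a sketch: the breakpoints are picked to line up with the widths in the lists above, the #content selector is a placeholder, and keep in mind that Internet Explorer 8 ignores media queries, so the default styles are what IE8 users will see.

<meta name="viewport" content="width=device-width, initial-scale=1" />

<style type="text/css">
    /* default: a fluid, single-column layout for phone-size screens */
    #content { width: auto; margin: 0 10px; }

    /* 800 pixels and up: the classic desktop sizes (800×600, 1024×768) */
    @media screen and (min-width: 800px)
    {
        #content { width: 760px; margin: 0 auto; }
    }

    /* 1280 pixels and up: widescreen desktops and laptops */
    @media screen and (min-width: 1280px)
    {
        #content { width: 960px; }
    }
</style>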

Best Viewed in 800×600?

In the late 1990s and early 2000s we went through a period where many web developers and designers went with a standard of "locking" their website designs to screens that were 800×600.  In many ways this made sense because (at the time) that was the most common screen resolution, and sites designed for it would still look okay on slightly larger monitors.

Go to your favorite search engine and do a search for the text +"best viewed in 800x600" (link goes to bing.com).  You will find that there are still thousands of sites with that text on them (and, presumably, a design still locked to that resolution).

In the years since it was in vogue to lock a design to one screen size, a number of things have happened:

  • Desktop screen resolutions got bigger – much bigger!  The 30” Apple Cinema display checks in at 2560×1600!  Laptops also got higher resolution
  • The variety of resolutions also increased with Widescreen Resolutions becoming popular
  • Smart phones became popular and we started browsing on devices that were much lower than the desktop and laptops
  • Browsers (and their associated standards) made it easier to have websites that could adapt to different size resolutions (yeah!)

All these factors have conspired to make locking your site into one screen resolution no longer a desirable practice.

The most important resolution to optimize for?

The ones that your users are actually using!  Knowing your audience is important, especially before you spend a lot of effort adapting your site to different screen resolutions.  Most stats applications (either server side or client side) will give you some statistics on the screen resolutions that are typical on your website.  As an example, if you have a site that is geared to information on mobile phones, I would suspect that you would see more traffic from smart phones (smaller screen resolutions).  If you have a site that hosts high resolution wallpapers, you probably see traffic with higher resolutions.  You won't know until you look!

Pete Prodoehl over at RasterWeb recently published some of his website stats, including the screen resolutions he sees on the site.  He noted that a couple of years ago he made some optimizations for netbook screens, but actually sees more traffic from Smart Phone size screens.

Don’t forget the firmware

Yesterday morning I got several updates delivered to my laptop via Windows Update.  These were not the security patches that Microsoft typically delivers on the second Tuesday of the month (commonly referred to as Patch Tuesday), but rather non-security updates to the system.  One example was an update to the daylight saving time functionality; it seems that governments around the world are still adjusting the start dates (see http://support.microsoft.com/KB/979306 if you want the details).

Seeing the updates reminded me that it had been a while since I checked my various devices for updates to the firmware.  This was a good reminder, so I spent a few minutes checking the devices and bringing them up to the latest versions.

Keeping your firmware up to date is just as important as keeping your operating system and applications up to date.  Firmware updates will often correct application issues, close security holes, improve performance and even deliver new functionality.

The IEEE has a very dry, but functional definition of firmware:

The combination of a hardware device and computer instructions and data that reside as read-only software on that device

I tend to think of firmware as this magical layer of software abstraction that has a very intimate relationship with the hardware, but I am kind of a software romantic.

Firmware is becoming increasingly important in computing as we have more and more intelligent hardware devices.  I bet the average person carries at least 1 device that has updatable firmware on it and many of us who are “mobile” workers can carry 3 or more (I have 3 on me right now: Zune, Mobile Phone and laptop).  Even the speaker dock that I have for my Zune has firmware in it.

Firmware updates are different

One of the things that is so nice about Windows Update is that it can be set to automatically update the system by the user (in a home situation) or by the administrator (in a corporate situation).  For the most part your updates are on auto pilot at that point, with the exception of getting prompted for a reboot if an update requires one.

Note: I don't mean to be a Microsoft "fan boy" in praising Windows Update here; the Apple software update process is similar and just as reliable.  There are philosophical differences between the two: Microsoft's updates are more numerous but much smaller in size, while Apple generally prefers fewer, much larger updates.  There are also update processes for software applications other than the operating system: Apple Update for Windows, Adobe, Firefox, etc.

Updates to firmware are generally not as seamless and carefree as the Operating System Updates.  Almost no hardware systems pull down the updates and apply them automatically.  The processes by which you apply the updates also vary greatly:

  • The Zune and iPhone firmware updates are pulled down automatically and applied when you tether the devices, but they require software on the machine to accomplish it.
  • The BIOS updates on a computer usually require you to download a specialized installer that updates the firmware the next time you reboot
  • Home routers and access points generally have you download a binary file and upload it through the management interface

The slightly more complicated update process reflects the specialized nature of firmware: because it sits at such a low level, an update needs a little more care than updating a software application or even an operating system.

When was the last time that you updated your firmware?

Influential technologies of the last decade

With the coming of the New Year we see a lot of Top 10 lists.  You know, like the Top 10 planning, design and development websites of 2009 or the Top 10 Quotes of 2009.  We are doubly blessed that, since this is the "end of the decade", we also get top 10 lists for that as well.  You know, like ESPN Boston's Top 10 of the Decade (all Boston teams) or Yahoo Games' Top 10 Video Games of the Decade (Super Mario Galaxy in the top 10?  Really?).

Not to buck the trend, I decided to put together my own list of influential technologies for the years 2000-2009.  As you read this, please keep in mind the criteria that I used:

  • The technologies listed are not in any order
  • The technology did not have to be invented after 2000, but had to have reached wide spread adoption or a major turning point after 2000
  • I tried to avoid naming specific products or websites, focusing instead on the technology or the trend rather than a specific implementation
  • I am strongly biased by my own personal experiences with the technologies, your experiences with them may be different than mine

Portable Music Players / Digital Formats – It is not hard to see the impact of the portable music player on our society; just walk down the street and look at the number of people who have white ear buds in their ears.

While the music player itself is easy to see, what is less visible is the companion shift to digital distribution of content and the mind shift that came with the change.  The digital music stores helped the music players take off (although all indications are that most of the music does not come from online stores).

RSS – Really Simple Syndication is probably the geekiest of all the technologies on this list.  This is one of the technologies that predates the 2000s but saw wide adoption in the last decade; if you had a blog or a website that published RSS before 2000, you should have a special badge to indicate your early adoption.  RSS is probably the third most popular document type on the Internet (behind HTML and CSS).  It is the best example of the power of a common data format.

Social Networks – Early forms of social networking existed before the year 2000 (Yahoo Groups was one that I used to hang out in back in the day), and the concepts behind social networking even pre-dated the world wide web, with people interacting on bulletin boards.  But again, it was in the last 10 years (actually 4 or 5) that social networking went from being a niche activity to seeing wide adoption.

The real impact of social networking is just now being felt as the "social" aspect expands from a casual activity that takes place outside of work to applying these principles to activities at work.  The overall trend of taking social technologies and applying them to the workplace is called the consumerization of IT, and we will see it with a number of the technologies in this list.

Smart Phones – One of the things the MP3 players mentioned earlier did was get us used to making our computing experience portable and taking it with us.  Going back to the 1990s we had Personal Digital Assistants and cell phones.  It was natural to combine the two into one device and throw in the MP3 player as well.

Broadband – In August of this year comScore released their latest estimates of broadband penetration in the United States: 89% of all Internet users now have some form of fast Internet access.  Personally I have had a cable modem for nearly 8 years, but I entered the year 2000 with dial-up access.

High speed access at the home was unusual in the 1990s; most people only had high speed access at their work place.  Now broadband access is becoming so ubiquitous that the people who develop websites and applications are starting to take it for granted.  By itself broadband access is a fantastic improvement, but like many infrastructure technologies, the real power of broadband is as an enabling technology that brings us other things (like streaming media).

Streaming Media – As I am writing this I have the television on in the background showing a movie.  The interesting thing is that it is streaming from Netflix in full high-definition quality to my Xbox over my internet connection.  There is no special magic about the Xbox; I could just as easily be streaming to my web browser or to any number of devices that support streaming.  Nothing special about Netflix either; I can stream from dozens of sites.  Contrast this with the web prior to 2000, when video (when you could get it working at all) was of low quality.

GPS – The Global Positioning System dates back to the 1970s as a military experiment and has been operational for civilian use since the 1990s, but this is one of the technologies that really took off in the 2000s.  The obvious adoption inside the car was a first step, but now that many phones come equipped with GPS we are starting to see the real applications of location awareness.

Game Consoles – Game consoles are not new by any stretch of the imagination.  As early as 1978 I remember hanging out with my friend Charlie after school every day playing his Atari 2600 for 46 minutes (the time between us getting off the bus and having to turn off the console before his mother got home from work).  But the generation of game consoles that launched with the original Xbox and the PS2 is really a different class of system.  The modern game console is a hub of entertainment, with connections to social networks and streaming video.  Certainly games have come a long way from Space Invaders.

Social Media (Blogs, Wikis, Podcasts) – The last item is less about the technology and more about what it has enabled.  A vast reduction in production cost and a huge reduction in distribution cost have led to the emergence of user generated content.  Some say that user generated content is replacing content from traditional media companies, but I look at the trend as additive; I still watch the evening news, but I have added social media to the mix as well.

I rather enjoyed putting this list together, but I am sure that I have missed a technology or two that is influential and would love to hear about the ones that I missed.  I will say that I intentionally left off search as a technology.  Search was clearly influential in the 2000s; however I think that it was established by the beginning of 2000.

Cost of Maintenance

The last time I was in New York City I got a fascinating history lesson about the Queensboro Bridge from the cab driver who took me from midtown to LaGuardia Airport.  He started off the lesson with a tongue-in-cheek statement:

It only took them 6 years to build that bridge 100 years ago, but they have spent the last 30 years just painting it.

I am fascinated by stories of complex construction like bridges, stadiums and unique skyscrapers and buildings.  So I really loved the cab driver telling me about the bridge and while he did, I got a lesson in the cost of construction vs. maintenance.

Initial construction – expensive and long

The history of the bridge begins long before construction actually started.  Attempts to build a bridge connecting midtown Manhattan to Queens go back to before the Civil War.  The location was ideal for spanning the East River because Roosevelt Island sits in the middle of the waterway.

After many years of failed private efforts, the newly formed Department of Bridges of New York City started construction in 1903, after replacing the architect on the project.  The construction was plagued with delays, including the collapse of part of the bridge in a wind storm and labor unrest that nearly led to part of the bridge being dynamited by upset laborers.  The bridge was finally completed in 1909 at a total cost of roughly 18 million dollars and the lives of 50 workers during construction.

Maintenance – more expensive and longer

The bridge and its associated rail and tramway systems underwent a number of changes during its history as the street car system was decommissioned and the rail systems were updated to accommodate the growing city. 

In 1979 a project was commissioned to rehabilitate the Queensboro Bridge.  Thirty years, six project phases and an estimated $300 million later, they are still working on the bridge.  The six projects did more than just paint the bridge (though a lot of effort was expended on painting it), but none of the changes altered the fundamental structure of the bridge.

Software projects

I do not mean to belittle the teams working on the rehabilitation of the Queensboro Bridge.  I am actually quite impressed at the amount of work they are able to do while keeping the bridge open for business; it is easier to build a bridge than to maintain one, because there is no traffic using it while it is being built.  I liked the contrasting story of the bridge's construction and maintenance because it shows something that we deal with every day in the world of software development: it can be more expensive to maintain a code base than it was to develop it.

If you work in software, you know what I am talking about.  There are lots of different estimates of the cost of software maintenance, but pretty much everyone agrees that the majority of IT budgets are spent just maintaining the systems companies already have in house, and that for every dollar you spend building or buying software you will certainly spend at least another dollar maintaining it (I have heard estimates that maintenance can run to 3 times the initial investment).

To put the Queensboro Bridge in perspective, I used the US Inflation Calculator and figured that the original cost of $18,000,000 in 1909 is about $269,454,545 in 1994 dollars, the midpoint of the 30 years of rehabilitation (consumer prices rose roughly fifteen-fold over that span).  That makes it a bit of a bargain when compared to some software projects.

Notes: The inflation calculator does not go back to 1909, so I used 1913.  There are some fascinating photos of the history of the bridge in the gallery at http://queensborobridge.org/, brought to you by the Greater Astoria Historical Society and the Roosevelt Island Historical Society.

Paper Prototypes


Jodie and I went furniture shopping this weekend.  Our basement furniture is really starting to show its age (10 years) and it is time for a replacement set.  We found a really nice oversized couch and chair + 1/2 set at a local furniture store (BTW – I did not know that there was such a thing as a chair + 1/2).  The furniture seemed very well made, the style was nice and it was available at a very reasonable price.  We were concerned about the size of the pieces in the set; we have a nice finished area in our basement, but the way the floor plan is laid out limits the area in front of the television where you would put a couch and a chair.

We could have bought the furniture, had it delivered and then started trying to figure out where to place the pieces, but I thought I would do some paper prototyping instead.  Using the measurements of the pieces that I got at the furniture store, I used newspaper and tape to lay out the dimensions of the furniture.  It was then very quick and easy to figure out that the space was too small for the oversized furniture.  About $2.00 worth of supplies and 30 minutes of time avoided the potential fiasco of discovering we had bought the wrong furniture.

In software development we get really hung up on prototyping.  How many times have you been in a conversation about how all the work done up to this point was a “prototype” and should be considered “throw away” code?  I personally have seen prototypes carry on for months, which is not a good use of time or effort. 

I blame a lot of the problems we have with prototyping on the tools that we chose to prototype with.  Quite often we use a high fidelity tool like Photoshop, Visio or PowerPoint (and I am sure you can add to the list).  The rest of the time we use the actual development tools that we would use to build the finished solution like Visual Studio, Flex Builder, Expression Blend, etc.  It is no wonder that our non-technical partners are confused at why this is a prototype when it looks like a real application.

There are technology solutions that allow you to create electronic low fidelity prototypes; Expression Blend's SketchFlow and Balsamiq are two that come to mind.  Those tools are fantastic, but you should try just using simple paper prototypes as well.  There is no more rapid prototyping than laying out screens and workflows with a few sheets of paper and a Sharpie.

Note: My story about the furniture would not be complete unless I told you that Jodie and I disagreed on our concern about the size of the furniture.  We were both concerned that it was too big, but I was concerned about how it would fit in the room, Jodie was concerned that the couch would not be able to even get down the stairs.  It turns out that we were both right, but I got a blog post out of my paper prototyping.

Measure twice, cut once

Last night I was hanging a cabinet at home and I wanted to make sure that it was centered on the wall.  I took measurements of the wall size, the cabinet size and where the anchors would have to be on the back of the cabinet.  Then I did some addition and subtraction to figure out where I would have to drill the anchors and marked the holes on the wall; this is the point where you are really committing to the process – making holes in your wall.

Right before I started to drill the holes I remembered a piece of advice that my father gave me when I would help him with home improvement projects: "Measure twice, cut once".  Now my father did not invent this concept, but he did a great service in teaching it to me.  So before I drilled the holes in the wall, I grabbed my yard stick and measured their location again.  It turns out the first time I measured, I marked the right hole at 11 1/4 inches from the wall and not the desired 11 3/4 inches (I was holding the yard stick upside down to measure, and it is easy to make that mistake when you do).  Since I measured the 2 holes separately, I would not have been able to hang the cabinet at all and would have ended up patching one of the holes (which would turn a 15 minute project into a 2 day project).  Taking the extra 2 minutes to measure twice saved me a lot of effort.

As with many things, I tried to relate the situation to technology, and in particular software development.  The easiest analogy to make is to software deployment and testing.  Deploying code is like putting the holes in your wall: you are really committing.  Testing is expensive, but not as expensive as having to patch software after the fact.  I am not advocating just running the same tests twice; normally you get the same results each time you run a test (data differences aside, that is how software usually works).  What I am advocating is running two totally different types of tests in order to double check your work.  Maybe the best practice is to run a battery of tests and then create a new set of tests, independent of the battery you ran before, looking for edge cases or failure modes that you did not anticipate.  The next time around you can fold those tests into the core test suite.  Just thinking out loud….

Regardless of how you "test twice, deploy once", you should be approaching your project with that mindset.  It might make your father proud.