Lighting, printers and Monday

http://docandersen.podbean.com
https://docandersen.wordpress.com
http://scottoandersen.wordpress.com
My Amazon author page!!!!
http://www.safegov.org

Neon accent lighting for the home office is now officially something I have looked into. I feel embarrassed to admit that. They say, however, that the first step to self-improvement is admitting you have a problem.

The problem with a home office added to a home you've purchased is that, in the end, the lighting is always wrong. For me, there are a couple of lights that need to move about a foot one way or the other.

Perfection is not just a state of mind. In the end it becomes an expression of what you want to do.

We are moving to wireless and network printers in the house. I am not sure why we have so many printers; it's a legacy from the old days. I don't print anywhere near as often as I used to. Barb still prints frequently, using both color and laser printing, but I find myself printing less and less often.

Partially that is because I don't need to. But I still have a few printers in the house, mostly the odd-format printers (large and photo) that allow you to print various things easily. It's nice to be able to take your phone, plug it in, and print a picture you've taken. The plotter is from the days when I was a practicing software architect and needed paper versions of networks and larger workflows. I haven't printed on the plotter since my last big project, which ended a year ago.

The thing about printers is, when you need one it helps to have one available. When you don't need one, it just sits there taking up space. A catch-22; thanks for creating the term, Mr. Heller. In the end, however, you have to have them.

.doc

Scott Andersen

IASA Fellow.

Free Cloud Storage–no really, it's free…

There is an interesting gamble going on right now in the technology/cloud space that I find intriguing. Google slashed the pricing on Google Drive to $1.99 for 100 gigs. To get that amount of storage on a hard drive, you would have to pay about $60. If you ran it for a year or two, you would hit the MTBF and would ultimately need to replace that drive. That means every 2-4 years you buy a new drive; figure a fair number is $100.

At the same time, for the same space, you spend $4 with Google. What a value proposition.
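The back-of-the-envelope math can be sketched like this. The figures (a $100 drive replaced every few years, $1.99 a year for cloud storage) are the post's own rough assumptions, not quoted market prices:

```python
import math

# Back-of-the-envelope comparison: local hard drive vs. cloud for 100 GB.
# All figures are illustrative assumptions from the discussion above.

def local_drive_cost(years: float, drive_price: float = 100.0,
                     replacement_every: float = 3.0) -> float:
    """Amortized cost of local storage, buying a new drive every few years."""
    return math.ceil(years / replacement_every) * drive_price

def cloud_cost(years: float, price_per_year: float = 1.99) -> float:
    """Subscription cost for the same 100 GB of cloud storage."""
    return years * price_per_year

print(local_drive_cost(4))  # 200.0 -- two drives over four years
print(cloud_cost(4))        # 7.96
```

Even if the cloud price were several times higher, the drive-replacement cycle still dominates the comparison.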

Now, the intriguing part. When you purchase new hardware (a new Chromebook, for example) they give you a two-year free account with 100 gigs of Google Drive space (or 50 gigs of Dropbox or 50 gigs of Box storage). They are betting that after using the space for two years you aren't going to want to stop. I am betting that pricing will drop through the floor in two years and I will pay a few dollars for two more years of Google Drive, or OneDrive, or Box, or Ubuntu One, or Dropbox, and so on.

The freebies model is intriguing, but in the end not as risky as you would think. I upgraded to a larger Amazon drive and was willing to continue paying for the storage. Why? Because Amazon preloads the CDs I buy into my Amazon drive.

The value of a solution like CloudHQ increases in this new world. I still think their subscriptions are far too expensive overall, but now, with 200 or more gigs of cloud storage, there is a much stronger case for moving the data around.

This leads me to an interesting evaluation. What are the criteria you need to consider in selecting cloud storage going forward?

Vendor name        Ease of Use    Ease of Access    Storage types supported    Overall cost
X cloud storage
Y cloud storage

Based on this framework we can begin evaluating the storage offerings against what we need overall. For example, I buy new music from Amazon and Apple, so ease of use is high for those two. Ease of access is less so for Apple, but improving. In building out the matrix you can evaluate the offerings for both enterprise and personal use. I am not sure that you would add anything other than overall security for an enterprise solution in this space.
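One way to fill out the matrix is to score each vendor on the criteria and weight them by what matters to you. A minimal sketch; the vendor names, scores, and weights below are made-up placeholders, not real ratings:

```python
# Weight each criterion by how much it matters to you (bigger = more important),
# then score each vendor 1-5. Scores here are invented for illustration only.
CRITERIA_WEIGHTS = {
    "ease_of_use": 3,
    "ease_of_access": 3,
    "storage_types_supported": 2,
    "overall_cost": 2,
}

vendors = {
    "X cloud storage": {"ease_of_use": 5, "ease_of_access": 4,
                        "storage_types_supported": 3, "overall_cost": 4},
    "Y cloud storage": {"ease_of_use": 3, "ease_of_access": 5,
                        "storage_types_supported": 4, "overall_cost": 3},
}

def weighted_score(scores: dict) -> int:
    """Sum of criterion score times its weight."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

ranked = sorted(vendors, key=lambda v: weighted_score(vendors[v]), reverse=True)
for name in ranked:
    print(name, weighted_score(vendors[name]))
```

Changing the weights (say, making overall cost dominant) is how the same matrix serves both personal and enterprise evaluations.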

The times they are a’changing…

.doc

Scott Andersen

IASA Fellow!

The best way to move forward is to learn from your mistakes.

Brittle Computing (continued)

When I started out I was a helpdesk professional. There is a certain reality to working on a helpdesk. You learn quickly that there is a human being on the other end of the phone. You also learn quickly what the skills and capabilities of that person are.

I used to have all sorts of support processes memorized. I could, without seeing a computer screen, walk someone to the configuration tab of a Lotus Notes Connector in Exchange 5.5. I used to be able to do Windows NT 4 configuration without seeing the screen (but can't now; too much other stuff in my head). That is a skill you perfect on the helpdesk: seeing what is actually on the other side of the screen.

That said, it is also a huge part of what I believe to be Brittle Computing. There is a corollary to Murphy's Law that says, loosely: work on something long enough to improve it and it will break. That is the reality of Brittle Computing. You couldn't make wholesale changes without a path backward.

With cloud solutions you can easily (and intelligently) remove some of that risk. Virtualization gives us snapshots of our VMs, so we can roll back easily and quickly. The concept of design for failure is a carry-forward from the Brittle Computing days, but the meaning has changed a little.

Brittle Design for Failure: Have a backup, have an outage window, and start the upgrade with a warm spare ready. Verify the upgrade worked; if it failed, go to the backup. Bring the backup online, and evaluate why the system failed to update properly (root cause analysis).

Modern Design for Failure: Build a resilient system that can adapt around failures. If the system fails during the upgrade of part B, bring part A online. What's an outage window?
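The difference can be sketched as code. This is a toy illustration of the modern pattern, not a real orchestration system; the node names and upgrade functions are hypothetical:

```python
# Toy sketch of modern design for failure: upgrade the standby (part B) while
# part A stays online, and only cut over if the upgrade succeeds. Part A never
# goes down, so there is no outage window.

def upgrade_with_failover(nodes: dict, upgrade) -> str:
    """Return the name of the node left serving traffic."""
    active = "a"
    try:
        upgrade(nodes["b"])   # upgrade the standby first
        active = "b"          # cut over only after it succeeds
    except Exception:
        pass                  # upgrade failed: part A is still serving
    return active

nodes = {"a": {"version": 1}, "b": {"version": 1}}

def good_upgrade(node):
    node["version"] = 2

def bad_upgrade(node):
    raise RuntimeError("upgrade failed")

print(upgrade_with_failover(nodes, good_upgrade))  # b
print(upgrade_with_failover(nodes, bad_upgrade))   # a
```

The brittle version of the same flow would take "a" offline first, which is exactly why it needed the backup and the outage window.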

The best way to move forward is to learn from your mistakes!

That is the motto of a Brittle Computing person.

.doc

Scott Andersen

IASA Fellow.

Brittle Computing…

I first learned UNIX commands in 1994 when the company I worked for decided (at my suggestion) that a SoftSwitch box would solve our mail routing problems.

It was DG/UX (that is what they used as the base for the SoftSwitch system). I learned ls and a number of other commands to run daemons and connect to a variety of systems.

I’ve been playing with Ubuntu for the past month or so. I have to say it is without a doubt a significant improvement over DG/UX. It is fairly easy to get up and running. The desktop software is impressive, as you don’t have to do anything, but I’ve also been playing with the server OS.

The easy part was getting Apache up and running. Now I am learning how to secure the solution so what I built isn’t vulnerable (it isn’t now, because it is on a non-routed segment).

I grew up as an IT person in the age of brittle computing (without an Internet, you had to wait for drivers to be shipped to you or for the tech to arrive). We didn’t make wholesale changes back then without a great backup plan. It was the dawn of the computer age and a very different time.

Back when I started, you had to have a backup. Now you can easily have your data in the cloud and reinstall all your applications in no time flat, simply via an Internet connection. It changes the operational paradigm. You see, brittleness makes you careful.

Having talked to a number of different customers over the past 20 years, I wonder if in fact we (those of us who grew up in the brittle age) couldn’t do a better job of conveying that to people just starting out. The concept of design for failure does come out of the dinosaur age of computing, but perhaps we could convey the reality of backup and restore as well. Sometimes things are lost because people don’t do a great job of sharing.

So, my first blog in the Brittleness series is this one. Watch for more in this series!

.doc

Scott Andersen

IASA Fellow.

Consumption-based utility wireless (wi-fi only)

I have railed for more than 10 years about network bandwidth (first I said the Internet wouldn’t take off until broadband access was widespread). Now I’ve been worried about the reality of what will happen at home. There is only so much airspace you can consume.

That got me thinking about the devices and the connections they have. As devices get more intelligent, they will hopefully begin to build on the concept of intelligent use (my device is usually used from 5-7 pm each day; the home network is used from 5-8 in the morning and from 3-10 at night). That is a concept that says: if I need to update or download information, I should do it during the low times of the network, not the peak times. Today, if your smart device needs to update, it most often happens during the peak times, not the off times (it requires your interaction).
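The scheduling logic a device would need is simple. A minimal sketch, using the peak windows described above (5-8 in the morning, 3-10 at night) as the assumption; a real device would learn these windows rather than hard-code them:

```python
from datetime import time

# Peak windows for the home network, taken from the example above.
# A smarter device would observe traffic and learn these itself.
PEAK_WINDOWS = [(time(5, 0), time(8, 0)), (time(15, 0), time(22, 0))]

def is_off_peak(now: time) -> bool:
    """True when the current time falls outside every peak window."""
    return not any(start <= now < end for start, end in PEAK_WINDOWS)

def schedule_update(now: time) -> str:
    """Decide whether a non-urgent download should run now or wait."""
    return "download now" if is_off_peak(now) else "defer to off-peak"

print(schedule_update(time(18, 30)))  # defer to off-peak
print(schedule_update(time(11, 0)))   # download now
```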

As we consider this overall problem, we should also consider the reality of what will happen. The number of devices that want (demand) connections is increasing. The reality of those connections has increased as well. Cellular devices (back in the day) only had voice connections and maintained a steady connection outside the home. They didn’t impact your home bandwidth much, other than the airwaves consumed. Now they connect to your wi-fi and move information back and forth. Your laptop, desktop, television, etc., as discussed on this blog, all move data.

The thing that I am wondering about today is the reality of intelligent connection. Do all these devices need to be connected at all times?

Why, with all the ability we have today, can we not simply have devices that use wi-fi as a utility? Truly apply that broad utility concept and only use wi-fi when you need it. Not always on and beaconing, but rather always off, connecting only when something is needed.

I have for years talked about the concept of intelligent sync (in fact, CloudHQ on ChromeOS is a great implementation of the initial part of the concept, connecting everything to everything) as a component of continued compute improvement. Now I wonder if we shouldn’t ask for wi-fi that is on only when being consumed.

I am just saying…

.doc

Scott Andersen

IASA Fellow

Fasten your seatbelts and please show your work…

I’ve had MIDI devices connected to my computer for (insert image of a person counting on fingers and toes, running out of appendages, and resorting to the pencils on the desk) more than 24 years now. I am not a musician; I just love to write music for my family.

Recently I’ve been thinking about all the cables in my office. I’ve found a solution for some of them, in particular the MIDI cables, with the new PUC device from Indiegogo. I haven’t connected it yet but will this weekend. I will post a review on my other blog when I get things up and running.

The reason for today’s discussion, however, has to do with connections. There was a time when, if you wanted to connect MIDI, input sound, and video, you had to have a number of specialized boards and connectors in your computer.

Now you can do all of that with USB.

Tomorrow that will be wireless.

Of course, then we get to my other blog theme (what happens when your house runs out of bandwidth). You can segment your network intelligently and still, in the end, run out of bandwidth.

The math is actually quite simple. X = the total bandwidth available from your ISP. X is bound by time (during evenings and weekends, more people using the total bandwidth for your neighborhood reduces your total available bandwidth). So X becomes stopping point 1. Y = the total bandwidth and spectrum used in your house.

  • Video surveillance systems chew bandwidth.
  • Netflix (at least in my house) on two systems reduces total bandwidth.
  • Computer updates, downloads, connections and so on also remove bandwidth from within your house.
  • Sensors and weather systems remove more bandwidth.

So now we have X − Y = Z, your net available bandwidth. By the rules of computer networking, the result cannot be a negative number. So where Y is variable and X is a constant, Z has to be a non-negative number.
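The equation in code, with the floor at zero made explicit. The numbers are illustrative, not measurements from any real network:

```python
# Z = X - Y: net available bandwidth after the house's demands are met.
# Z is floored at zero; demand beyond X shows up as congestion, not negative
# bandwidth. All figures below are invented for illustration.

def available_bandwidth(isp_mbps: float, house_demand_mbps: float) -> float:
    """Net available bandwidth Z, which can never go negative."""
    return max(0.0, isp_mbps - house_demand_mbps)

X = 50.0                      # ISP bandwidth (varies by time of day)
Y = 12.0 + 10.0 + 5.0 + 3.0   # cameras + Netflix + updates + sensors

print(available_bandwidth(X, Y))     # 20.0
print(available_bandwidth(X, 60.0))  # 0.0 -- demand exceeded supply
```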

At some point, in reality, Y has to become a constant. You can use 802.11n, g, and b to spread the spectrum, but you run the risk of hitting the same spectrum as both cellular and handheld landline devices.

Back to 9th grade: solve for Y, where X may change randomly. Oh yeah, and Z can’t be a negative number.

Please show your work.

.doc

Scott Andersen

IASA Fellow

Why isn’t Windows 8 better?

When Windows 7 shipped (what is it, 4 or 5 years ago now?) it was, bar none, the best operating system, period. OK, some of that is propaganda, but in terms of what the product encompassed it was the best OS available. Yes, the Macintosh was certainly good, but it got better during the time of Windows 7. A whole lot better. Chromebooks and ChromeOS were out, but they also got better during the life of Windows 7.

I no longer think Windows 8, or even 8.1, is the best OS. I think they are in second place for the most part, and often third or fourth depending upon the task at hand. So let’s rate OSes based on the simplicity of performing a specific task.

Task 1: I want to have my files available to me wherever I am.

  1. ChromeOS
  2. Windows 7/8
  3. Macintosh and iOS
  4. Android
  5. Ubuntu 12 LTS

First task rated: ChromeOS came from last place in my eyes just 3 years ago to first place. The more I use CloudHQ, the easy movement of my information along the paths I want makes it the clear winner. Windows 7 is tied with Windows 8 for second; not the best start when you can’t even beat your old product. Now, if we limit this to OneDrive only, MS wins. But the world isn’t about OneDrive only anymore. (Buy a new Chromebook for yourself and you get 100 gigs of Google Drive storage for 2 years for free. Thereafter it’s only $1.99 a year for 100 gigs. Seriously.) (Wow, this one really hurts. Jim Wilt, you were right.)

Task 2: Use MS Office Apps to start, edit and publish information.

  1. Windows 7/8
  2. Macintosh
  3. iOS
  4. Android
  5. ChromeOS
  6. Ubuntu LTS

What scares me here is that the market for numbers 3/4 has become a huge unserved market. From an MS perspective, they have no clear product after number 2. They haven’t really improved Office on Windows 8 to make it better than the Windows 7 experience.

Task 3: I want to draw a circle on a map to denote where my house is for guests I have invited.

  1. iOS
  2. Android
  3. Windows 8 with touch input
  4. Macintosh with touch input
  5. Windows 7 with touch input
  6. ChromeOS with touch input
  7. Ubuntu LTS

Now we start to see the missing reality. I can spend a ton of money buying the hardware and software required to make my Windows machine operate like a tablet. By moving to the touch world, I do finally beat Windows 7. But I can’t beat the solutions that are the tablet market today.

Overall, Windows 8 is a very nice upgrade (though I hate the Metro UI). It’s smooth, and the functioning with touch is great. I am looking at and working with two different machines, and frankly the tablet PC format is much improved from Windows 7 to Windows 8. I just wonder if the folks in Redmond missed the boat.

.doc

Scott Andersen

IASA Fellow.