General Jared Shockley

New Year, New Thoughts

So I post again to the blog. Many of you might be asking where I have been. Many of you might not care. I can accept both sides of the coin on that. It is something that I have been coming to terms with for a while now.

Before and after MS TechEd 2013 in New Orleans, I had a fire to get this blog up and running. I thought I wanted to use it as a platform for my passion around technology and for trying to energize the people around me. I went to TechEd and learned a lot, but I also got to network with some of my "chosen tribe" of IT Professionals from around the world. Events like TechEd make me sit back and appreciate my spot in the world while enjoying myself. Many people disbelieve me when I say that I pay for it out of my own pocket and use PTO to attend. That is how strongly I feel about the experience you get there and the training offered through the sessions, labs, vendor discussions and testing opportunities. I came home excited and wanting to share that excitement with the world. Then, it happened.

First, Microsoft pulled the TechNet subscriptions from IT Professionals. I understand that many pirates used TechNet as a way to get licenses to sell illegally, but it is also used by both IT Professionals and IT departments to set up testing environments for the systems they have and for systems they hope to adopt. If my company did not have our TechNet licensing, we would not have looked at SharePoint 2010 for some of our internal needs, nor would we have done anything on Windows Server 2008 or 2008 R2. Luckily, we made money available to get an MSDN account so that work could continue, but other companies may not be able to. The response we get from Microsoft is to use the time-limited demos. I am sorry, Microsoft, but those are not functional enough for companies to create testing environments to test integrations and patching. What made things worse were the patching errors that started in August and September. Many products had patches that were rolled back because they damaged environments. If we all had proper testing environments, we could find that out before going to production.

If the TechNet cancellation wasn't enough, Microsoft Learning tried to slip out its cancellation of the MCM/MCA certifications on a Friday night before a 3-day holiday weekend. Much has been written about this, and many MCMs and MCAs are upset, to say the least. While I do not hold an MCM or MCA, I was building an education track to get one in SharePoint. Some of you may know that SharePoint is near and dear to my heart. It is an underused technology that gets vilified quickly by users. Many times, the configuration and management of the SharePoint system is what users hate, not the product itself. But I digress. With this announcement, I put my education and certification plans on hold. I am re-evaluating them as we speak, but they are vastly different from when I started.

What I feel in these actions is much like what Rod Trent asked in his September 2, 2013 piece, "Does Microsoft Hate IT Pros?", and what Paul Thurrott raised in his September 30, 2013 commentary, "What's Next for IT Pros?": is Microsoft trying to kill the IT Professional? In a way, yes. Microsoft sees a future in the cloud and believes IT Professionals should embrace it. In some ways, they should become "cloud developers" using PowerShell to manage systems where the infrastructure is a black-box layer they need not worry about. For startups and companies with minimal regulations, that is a wonderful story. But for companies under heavy regulation, such as SOX, SEC rules, HIPAA, and HITECH, cloud computing is something that just does not make sense today. I can see that Microsoft has something in mind for IT Professionals, but they are not saying what, and that is the key issue.

They need to communicate with their base users, the IT Professionals. Let us know what is going on and what is coming down the pipe other than just cloud computing. Give us more clarity for that higher level of education. One good thing they have done is the Microsoft Virtual Academy. This resource has been a boon for many IT Professionals, and I encourage everyone to run over and check it out. Also, Microsoft, remember that while developers can expand your platforms, IT Professionals ensure they get deployed into companies. The free stuff at Build would be nice to see at MS TechEd as well. The Surface offer was a great one, but seeing developers come back from Build with 2 free tablets felt like a solid hit to the stomach. Who ensures that the developers have platforms to develop on? The IT Professionals, that's who.

Now I have got my words out. It has been 6 months in coming. What I can tell those of you who like to read what I write is that I will be writing again. I hope to write one good article per month that has some length to it. At the same time, I plan to use my blog as a "cheat sheet" for myself as well. When I hit a brick wall, I plan to blog how I either worked around it or knocked it down. To that end, expect to see a lot of how-tos and reviews as well. I have gotten some fun stuff since July and plan to get more. If you have any ideas, do not hesitate to let me know. I am open to just about anything.

How To and Tips Jared Shockley

How To: Easy Way to Remove the OneNote “Send to OneNote” App

I write all my blogs in OneNote. I use OneNote on my desktops, Surface RT, XPS laptop and Windows Phones. You might say that I am a OneNote fanboy; you might be right. One complaint I hear often is, "How do I get rid of this stupid 'Send to OneNote' app that gets installed with OneNote?" While I find it useful on some systems, I kill it on others. I am going to explain how to do this in OneNote 2010 and OneNote 2013. As with most Office installations, each version has its own subtle differences. Here's the quick way to get it done in either version.

Steps to perform
(Screenshots: guide in OneNote 2010 and guide in OneNote 2013)
1. Open up OneNote
2. Click on the File tab
3. Select Options menu item in the File menu
4. When the Options pop-up appears, select Display from the left menu.
5. Uncheck the "Place OneNote icon in the notification area of the taskbar"
6. Click the OK button to save.
7. Say goodbye to the Icon and App!
Hardware Jared Shockley

Great Alternative to Replace 15K SAS Drives [UPDATED]

One of the most expensive things to replace on servers these days is the 2.5" 15K SAS drive. With many HP ProLiant servers at my place of employment, getting these drives from HP is a pricey thing for the average small business. Thanks to some clever work by my company's Systems Administrator, we found a great alternative if you do not care about, or do not have, warranty coverage for your servers.

It seems to me that the most expensive things to replace in servers today are the spinning disks. My company has many HP ProLiant servers, including the 360, 380 and 580 series. The modern generations of these servers, the G6, G7 and Gen8, all utilize 2.5" 15,000 RPM SAS drives from either Seagate or Hitachi. For the 146GB size of these drives, I typically see pricing from $168 up to $252 per drive for HP name-brand drives. Yes, these drives come with HP's warranty, but in some cases that is useless. My company, in healthcare, is an example. If a drive held ePHI (electronic Patient Health Information), I have to wipe or destroy it. If the drive is bad and I can't wipe it, then I have to destroy it. HP will not certify destruction, so I have to do it myself and buy a new drive anyway.

With the majority of warranties covering only hardware replacement, and given the restrictions my team was under, we moved from buying warranties to skipping them and keeping stores of spares available to swap in when problems arise. Now we needed to tackle the high costs we were seeing. As I said, my Systems Administrator did some digging and searching online and found the Seagate Savvio 15K.2 ST9146852SS hard drive. This was an exact replacement for the 146GB 15K SAS drives in our servers. It was a solid replacement spec for spec, and the pricing even caught our eye … $96 per drive. You read that right: $96 per drive. To sweeten the deal even more, I could order these drives from Amazon with my Prime membership and have them in hand 2 days later. The Amazon vendor selling them, Yobitech, was solid with their customer support and gets my backing for supporting the Prime shipping requirements. They were a pleasure to work with and I will be going back to them for more purchases.

Are you utilizing your warranties or are other business needs blocking you from using them? What do you think of having stores of spare parts instead of buying warranty programs from vendors? Put your thoughts below in the comments section.

UPDATE: I just noticed that the price from Yobitech was $120 per drive. The lower $96-per-drive cost is from another vendor. The drives themselves are solid and worth every penny. I am ordering from the new vendor and will report back on how their customer service is.

Jared Shockley

What To Do With These Amazon Gift Cards

I am a hoarder much like anyone else. I have a lot of stuff in my house that needs to be sorted through and probably recycled, sold or thrown out. My server drives are full of information that "I might need someday". Worse yet, I got 3 Amazon gift certificates in the past 3 months and I still haven't spent the money on anything. I was looking all week at a couple of items and thought I would share some of my shopping fun.

(Image: Nokia DT910 Wireless Charging Stand) First and foremost, when I bought my Nokia 920, I got the recharging plate for wireless recharging of the battery. Thinking at the time that it was "gimmicky", I never put much stock in it for the long term. Wow, did I make a huge mistake. The wireless charging capability of my Nokia phone is one of the top things I love about it, period! Forget my desire for the Windows Phone 8 OS and the Microsoft infrastructure I am solidly held into; the fact that I can recharge my phone without fumbling with cables is a godsend. The only downside: my phone lies flat on my nightstand. While shopping for Nokia's new wireless car charger/holder, I found the Nokia DT910 Wireless Charging Stand. It will fit my nightstand needs to a tee.

On top of the charging stand, I have noticed that my Nokia 920 is getting small "micro scratches" on its Gorilla Glass. I can't think of what could be doing this to the glass, but I need to protect it. Thanks to my friend Mike Bender, there is a lovely solution for that as well. ArmorSuit makes screen protectors for many cell phones, including the Nokia 920. The specific model I want, the ArmorSuit MilitaryShield - Nokia Lumia 920 Screen Protector, is pretty inexpensive and does not interfere with the screen's operation. He seemed to be pretty happy with it, and at $9.95, I am willing to take a trial run with it. I might even put a bumper case around the phone, as I notice I am getting a bit clumsier with this phone than with prior ones.

(Image: Nokia Wireless Charging Car Holder CR200 with a Nokia 920 in place) So, these are the two things that have caught my eye this week. If Nokia had released their CR200 Wireless Car Charger, I would have added that to my shopping cart as well, no questions asked. I am going to spend some more time perusing the games, filters for my Canon camera and lenses, and other gadgets I can't live without.

What are you buying these days? Spending more time and money on technology? Some other craft?

Azure Jared Shockley

My Azure Hosting Hiccups, or "How to Shoot One's Self in Your Own Foot"

As you might have read, I moved my website onto Azure a couple of weeks ago. I have not looked back at all. Well, okay. Two events made me rethink my strategy around hosting on Azure. One was my own doing, and the other was a conflict between DotNetNuke and the Azure SQL model. Both were resolved and I am again 100% hosted on Azure, until the next problem rears its ugly head.

Let's review how I got to today. First, I started out on Azure with my DotNetNuke instance using the DotNetNuke Azure Accelerator. It was a miserable failure and I was floundering. I also had other issues going on that night with various technologies and decided to skip it. Then, I found the ease of setting up my Azure hosted DotNetNuke CMS system. Success!

Let's move on to last Saturday, March 2nd. I decided to do some reconfiguring of the website on Azure. First thing, I reviewed my account; my bandwidth and processing needs were pushing the limits of the free account, so I had to change from the "Free" web hosting instance to the "Shared" model. On top of that change, I wanted the URL to be my own website's URL and not the azurewebsites.net version that is created when you set up a website on Azure. Lastly, I wanted to use a publishing system so I could upload changes to my site when updates came out. In my case, the only one I had some experience with (and not very much, as I found out) was Git, but I did not want to tie my Azure site to GitHub, so I selected local Git on my desktop. With all of these actions, I pulled out the gun, filled and loaded the magazine, chambered a round, and pointed it at my foot.
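For anyone curious, the local-Git publishing flow I chose looks roughly like this. This is a minimal sketch, assuming Git is installed; the file name and identity are made up, and a local bare repository stands in for Azure's real Git endpoint (which follows the pattern https://&lt;user&gt;@&lt;site&gt;.scm.azurewebsites.net/&lt;site&gt;.git):

```shell
# Stand-in for Azure's remote Git endpoint so the flow can be tried offline.
workdir=$(mktemp -d)
git init --bare "$workdir/azure-remote.git"

# Local working copy of the site content (file name is hypothetical).
git init "$workdir/site" && cd "$workdir/site"
git config user.email "blogger@example.com"   # hypothetical identity
git config user.name  "Blogger"
echo "<h1>DotNetNuke site</h1>" > default.htm
git add -A && git commit -m "Initial site content"

# Point a remote named "azure" at the endpoint and push; against the real
# service this would be: git remote add azure https://<user>@<site>.scm.azurewebsites.net/<site>.git
git remote add azure "$workdir/azure-remote.git"
git push azure HEAD:master
```

Each later `git push azure` would then trigger a redeployment of whatever was committed, which is exactly the tie-in that caused me grief below.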

Sunday morning rolls around and I get a text message page at 6:30 am; my Azure website is offline. HUH? How can it be offline? Did Azure have another one of their illustrious outages? Looking at the site on my phone, I got a 502 error. Ummmm … "Bad Gateway"??? Thinking my DNS was having issues, I went to the default Azure website URL and got slapped with another 502 error. My site was down! Jumping out of bed, I fumbled my way to my computer and started to look at the issue. I pulled up the Azure Portal, my site, my monitoring services and my VM-hosted mail server to get an external perspective on the issue. No matter how many times I pressed SHIFT-F5, the site was down. I checked all browsers; still the same. I had the monitoring service check from all of its servers; still down. Looking through the Azure Portal, nothing seemed to be misconfigured. Checking the Azure DB, no issues there. The last check was the webserver logs from Azure; the logs did not show anyone visiting the site. Huh? How could my attempts from my phone, home computer and hosted VM not register in the logs? I restarted the website services; still nothing in the logs. One more SHIFT-F5 and "Ta da!", the website was functional. HUH? BLAM! That hurt.

I don't like having mysteries. One of the toughest things for me in my IT world is to have something fix itself without my knowing the root cause. Many of you might remember IBM's commercials about "Self-Healing Server Pixie Dust". I mock those commercials because some parts of servers can fix themselves but others cannot. System admins are still a necessary group of people no matter what technologies you add to hardware or software. Giving those professionals the information they need to perform good root cause analysis is more important than self-healing. Yet, this is what I was looking at. Nothing in the logs, in the stats, nor in the code told me what was wrong. Nothing like this happened in the 7 days I hosted the site on the "Free" model. Being a good IT Operations person, I started rolling back my changes. Doing the easy stuff first, I reversed the DNS work and then went to breakfast. During my meal, I got 10 pages that my site was up, then down, then up, then … well, you get the idea. After breakfast, I went home and switched the site back to the "Free" model. I waited for any change and was met with similar pages, watching my site flip between non-responsive and responsive. My final thought was that the problem must be in the Git deployment system.

The story turns very interesting at this point. Reviewing the settings in Azure, there is no way for an administrator to remove a deployment system from a website. No mechanism exists in the Azure Portal to change it once a deployment system is selected. I was stuck with an unstable site and no way to revert what I had done. It seems Azure's method is to just recreate the site. I copied the code from my Azure website to my local computer, deleted the Azure website, created a new one in Azure and copied the code back from my desktop. Thanks to many factors, the file copying seemed to take hours; in reality, it took 35 minutes for the download and upload combined. I clicked on the link for the new site and … ".NET ERROR". A huge sigh and a facepalm later, I delved into what was going on. DotNetNuke was missing key files; my copy over the internet did not include them. Instead of trying to figure out where I went wrong, I reviewed what I had: an Azure website with bad code and an Azure SQL DB with my data. To make it easy on myself, I decided to build a new DotNetNuke installation from scratch with a new DB, then copy my blog data back in to complete the work. After approximately 2 hours of work, my site was back up and running on the Azure URL. Success!

Going over all of the changes I wanted to make, I decided to separate them out and leave each in place for 24 hours to verify that it did not affect my site. The critical change was moving from the "Free" mode to the "Shared" mode for the website; Azure would block the site if I did not do this because I was over my resource limits. This was a no-brainer, so it was my first change. I re-enabled the redirect from the server that hosted this site before, and all was working again. Monday night rolls around and all has been stable. My next change, pointing the URL to my domain name, was prepped and executed. My site was stable for the rest of the night and into the next day. My analysis was correct: the configuration of Git as a "publishing" system was the cause of my outages on Sunday. Tuesday night led to a lot of review of Azure web publishing. All of the information I found led me to my final conclusion: I am not developing my own code and do not need publishing. None of the systems would help me; they only looked to make things more difficult. In its current mode, I can FTP files up and down from the site, which is good enough for me.

Let's move on to Wednesday. I received a notice from DotNetNuke that they had released version 7.0.4 of their system, and my site was running 7.0.3. I should upgrade to make sure I am safe, secure and stable, right? As I started to download the code for the update, I got the gun back out, filled and loaded that magazine, chambered a round, and aimed it right next to the hole I put through my foot on Sunday. Using FTP, I uploaded the update code and pulled up the upgrade installation page. I waited for the upgrade to complete while working through my e-mail. When it completed, I turned and saw "Completed with errors". BLAM! I have got to stop shooting myself like this.

One of the modern advantages of DotNetNuke is the logging that upgrades and installs now do. I was able to pull up the installation log and get the exact error messages from the upgrade: 3 SQL errors while it was processing the SQL upgrade statements. Looking at each one, the error messages were confusing to me. In two of the errors, the upgrade tried to determine whether an index was in place and then remove said index to replace it with a new one. Yet, when this ran against my Azure DB, it threw an error saying "DROP INDEX with two-part name is not supported in this version of SQL Server". How was I going to fix this? For those of you who don't know, my start in IT was in SQL DBA work and programming. I dug out my rusty SQL skills and worked through the database alongside the MSDN documentation for Azure SQL. In no time, I figured out how to modify the DotNetNuke code and run the SQL statements against my Azure SQL DB: Azure SQL requires the newer DROP INDEX syntax that names the index and the table separately. The third error was even more interesting. The DotNetNuke code wanted to verify that a default value was set for a column in one of the tables. The way this is normally done in SQL Server is to query the sys.sysconstraints system view. The problem is that Azure SQL DB has no sysconstraints view available; the SQL statement that ran returned "Invalid object name 'sysconstraints'". More digging and I found my answer: Azure SQL has the newer catalog views check_constraints, default_constraints, and key_constraints available. A quick change to the default_constraints view and I found that the desired default was in place. My upgrade was now complete and a success.
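To make the two fixes concrete, here is a hedged sketch in T-SQL. The table, index, and column names (dbo.Modules, IX_Modules_TabID, ModuleTitle) are placeholders I made up, not the actual DotNetNuke objects from the upgrade scripts:

```sql
-- Legacy two-part DROP INDEX syntax (what the upgrade scripts used; fails on Azure SQL):
--   DROP INDEX dbo.Modules.IX_Modules_TabID;
-- Azure SQL only accepts the "index ON table" form:
DROP INDEX IX_Modules_TabID ON dbo.Modules;

-- Legacy default-constraint check via the sysconstraints compatibility view
-- (not available in Azure SQL, hence "Invalid object name 'sysconstraints'"):
--   SELECT * FROM sysconstraints WHERE id = OBJECT_ID('dbo.Modules');
-- Azure SQL equivalent using the sys.default_constraints catalog view:
SELECT dc.name, dc.definition
FROM sys.default_constraints AS dc
JOIN sys.columns AS c
  ON c.object_id = dc.parent_object_id
 AND c.column_id = dc.parent_column_id
WHERE dc.parent_object_id = OBJECT_ID('dbo.Modules')
  AND c.name = 'ModuleTitle';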

As you can see, I did all of the damage myself; I cannot blame Azure for it. My impatience, not reading all the way through before just getting things going, caused my own downtime. I have no doubt my thrifty behavior will also be my downfall when Azure has any sort of outage in the US West Websites or SQL DB layers. If I want a website that will never go down, I need to build and pay for the Azure infrastructure to do that. For now, I am super happy with my decision. To the cloud!

Are you thinking about moving your website into a cloud provider? If not, what is stopping you from doing that? Post your questions and comments below.
