I am very lucky to know many great people in the IT world. These people are experts in their fields, outspoken about what they know, and love to pass along this knowledge. One of these people is Phoummala Schmitt, aka Exchange Goddess. She is a contributor to the Petri.com website and someone fun to follow on Twitter. She started a podcast, Current Status, with two other great IT experts, Theresa Miller and Melissa Palmer. I have had a few friends appear with them.
Now, let's look back a few months: I got a DM from Phoummala. "Hey, we want you to appear on Current Status to talk about SharePoint." All I could think was, "Is she serious? Is she setting me up for some joke about wanting to talk about SharePoint, or are they just going to make fun of me and SharePoint?" Knowing Phoummala as I do, I knew that this was not going to be the case, or at least not 100% of the case. We set up a time and away we go.
Guess what fans? That show is tonight. You can head over to YouTube and watch it live at 10:00 pm Eastern/7:00 pm Pacific. I will be joining from my hotel room in Dallas (crossing my fingers about bandwidth) to chat with Phoummala, Theresa, and Melissa on SharePoint, Office 365, unicorns, and probably some Azure. Won't you join in and watch?
If you read my prior post about my new Hyper-V rig, you know I wanted to get some of my current systems running over there so I could see the performance differences. I also wanted to get those VMs off my 2008 R2 Hyper-V host so I could possibly upgrade it to 2012 RC as well and do some clustering between them. While there are many ways to do this, I did my moves the completely manual way. Kids, do not try this at home …
With my brand spanking new rig, I wanted to get some VMs running on there other than the couple I spun up to test disk I/O and memory allocation. I wanted to get some real load on the box. To do that, I needed to move some of my other machines from my 2008 R2 Hyper-V host and convert them to run on 2012. There are many ways to accomplish this with many tools, but I chose the worst of all routes to ensure that the systems converted just fine.
As all of the servers needed their monthly patches, the first thing I did was patch them. Yeah, much like many IT folks, my servers needed their patches installed. Similar to the sayings about cobblers' kids going shoeless and carpenters' homes going unfinished, IT people do not always practice what we preach. One thing I hope to set up with the additional hardware and space is a patching system to do this for me. Also, since I was patching the VMs on my 2008 R2 host with its bad disk I/O, this process took a very long time, as I had to patch each machine individually.
As each server finished its patching, instead of rebooting the box fully, I just shut it down. I then copied the VHD file from the 2008 R2 host over to the 2012 host and converted the hard drive to VHDX. For my web servers that connect to a file server for configurations and website content, I turned the VM back on on the 2008 R2 host while the conversion ran.
As I have 2 web servers, the copy took about 5 minutes and the conversion about 20 minutes for each one. By turning the web servers back on on the 2008 R2 host, I could work on the conversions and setup on the 2012 server while my sites still operated. Once all was set on the new VMs hosted by 2012, I had the NIC settings dialog with the server's new IP address ready, hit "OK", and shut down the 2008 R2 host's VM. Downtime was about 10 seconds on one server and about a minute on the other. The only reason the downtime was longer on the second server was my fat finger on the gateway address. With that, the web servers were moved to my 2012 server.
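For anyone who would rather script the move than click through Hyper-V Manager, the same steps map onto the Hyper-V PowerShell cmdlets that ship with Server 2012 (`Copy-Item`, `Convert-VHD`, `New-VM`). Here is a sketch that builds that command sequence; the VM name, paths, and memory size are placeholders, not my actual lab values:

```python
# Sketch of the manual move as a scripted sequence. Copy-Item,
# Convert-VHD, and New-VM are real PowerShell/Hyper-V cmdlets on
# Server 2012; the names and paths below are made-up examples.

def migration_commands(vm_name, src_vhd, dest_dir, memory_mb=2048):
    """Return the PowerShell commands mirroring the manual move:
    copy the VHD off the old host, convert it to VHDX, then create
    a new VM on the 2012 host attached to the converted disk."""
    vhd_copy = f"{dest_dir}\\{vm_name}.vhd"
    vhdx = f"{dest_dir}\\{vm_name}.vhdx"
    return [
        f'Copy-Item "{src_vhd}" "{vhd_copy}"',
        f'Convert-VHD -Path "{vhd_copy}" -DestinationPath "{vhdx}"',
        f'New-VM -Name "{vm_name}" -MemoryStartupBytes {memory_mb}MB '
        f'-VHDPath "{vhdx}"',
    ]

# Print the sequence for one hypothetical web server:
for cmd in migration_commands("web01",
                              r"\\hv2008r2\d$\VMs\web01.vhd",
                              r"D:\VMs"):
    print(cmd)
```

I ran the equivalents by hand, one server at a time; a script like this would just save the typing on the third and fourth moves.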
The next server I chose to move was my Exchange 2010 server. In my lab, I was running a single-server installation (waiting now for all my friends to give me hell for this) and wanted to move it over to my new 2012 system as well. As the only mail going through the server was for my own addresses, taking it offline was not an issue. After it completed its updates, I moved the VHD over to the new 2012 host and started its conversion, but did not restart the old VM on the 2008 R2 host. Once the conversion was done, I created the new 2012 VM, attached the freshly converted VHDX, and brought it online.
With installation of the new Hyper-V Integration tools and drivers, all of the VMs were brought up to 2012 hosting standards. Currently, I am running only 2 VMs on my 2008 R2 hosting system: my SQL 2008 and my Windows Home Server 2011 systems. In reviewing my process, I could easily apply the same steps to the WHS 2011 image, except that its backend iSCSI connection is provided by my DroboPro. I need to figure that out again, but I am not happy with my current WHS configuration anyway. I do not feel it offers me any more advantages as a single device, as I use neither its remote access capability nor its media streaming. It has ended up a file server, and I would prefer to run a real file server on Windows 2012 instead. As for the SQL 2008 server, I plan to build a new one on the 2012 host, mirror the databases across, and then use the 2012-hosted VM as my primary.
Many might say I am taking a big risk in that there is no known upgrade path from 2012 RC to 2012 RTM. I have also done something most might call dumb in moving the VHDs manually. But I weighed the risk/reward of building new VMs from the VHDs copied off the 2008 R2 host against installing tooling that would have taken a long time, and the manual route won. If Microsoft allowed the 2012 Hyper-V tools to work with 2008 R2 servers, I might have used some of the other tools, but that will not happen by the looks of things. I have 3 of my main servers over on the new host and they are looking pretty damn good.
Next, I will be working on both the SQL and WHS/File server. I will post more about that work when I get to it.
We all want new toys, right? That excitement about opening the gifts … wondering what was inside … hoping for exactly what you want … the thrill of seeing the new toy. I remember my birthdays and Christmases of many prior years: the excitement of the wrapped packages, the adrenaline as the wrappings were pulled off, the joy on my face as the new thing landed in my hands. Those are the same feelings we still get today. While I may now be the one buying those gifts, the joy of opening everything I order is just the same.
In January, I read a blog entry (http://www.expta.com/2012/01/blistering-fast-windows-server-parts.html) by Jeff Guillet (pronounced GEE-yay for all of you) about his new Hyper-V server rig for his home lab. Reading through, I was really interested in building my own rig. I have my own machine right now: a quad-core processor with 12 GB of RAM, running Windows Server 2008 R2 from my TechNet licensing. This has allowed me to do some testing of systems like Exchange 2010, SQL Server 2005 and 2008, Windows Server, client virtual machines, and Windows Home Server. This rig has been pretty good at providing what I need for testing, but I have to run specific machines at different times because of the processor and memory restrictions. I have also learned how to run machines on smaller footprints; something I can't tell is a good or bad thing.
Since reading this great article, I have started on my own certification path via the "60 Days to MCSE". In this endeavor, I realized that my current Hyper-V host was not large enough for the job. I sat down with Rick Smith (@slegsmith) from my IT team at my day job to discuss what Jeff put together. The goals for this system were to use as much newer technology as possible, have a minimum of 32 GB of memory for virtual machines, and have responsive drives for the virtual machines as well. In the end, Rick came up with some great hardware. I was lucky enough to have a case and power supply already, but had to get a motherboard, CPU, memory, and storage drives. Here is the order I made:
I finally got this system together in late May and fell totally in love with this rig. I was able to put it all together and start building it with Windows Server 2008 R2 as a Hyper-V host. I could have built it with a "core" install, but I have run into issues managing "core" installs in the past. Once everything was installed, I was very impressed with the performance. To verify my personal observations, I ran some performance tests and got the following results:
Storage Performance Statistics
The Samsung SSD Mirror RAID statistics were:
The Velociraptor statistics were:
Memory and CPU Statistics
[Benchmark comparison chart (Memory Read and Calc Memory, in MB/s) across reference systems including a Core i7-3960X Extreme, Core i7-990X Extreme, 4x Core i7-965 Extreme HT, 6x Phenom II X6 1055T, 8x Xeon L5320, 8x Xeon E5462, and 8x Opteron 2378, on boards including the Asus Sabertooth 990FX, ASUS P6T Deluxe, and Tyan Thunder n3600R, with unganged dual DDR3-1333 memory; the numeric results did not survive extraction.]
The tools I used were:
I had started to build my SCCM environment but ran into time constraints before I left for MS TechEd. After I returned home, I reinstalled the OS with Windows Server 2012 Release Candidate and just have to say "WOW!" Instead of running the 3 Velociraptors in a RAID or as separate drives, as I had with my initial install of Windows 2008 R2, I created a storage pool across the 3 drives with parity. This makes it act like a RAID without using a RAID card, and the throughput was amazing; my IOPS were through the roof. Here are the different HD Tune Pro results:
The Samsung SSD Mirror RAID statistics were:
The Velociraptor statistics were:
Clearly, the storage pool is the way to go for me in this build. I will blog again about how it is working in a few weeks.
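To put a rough number on the capacity trade-off: with single parity across equal-size drives, about one drive's worth of space goes to parity data, leaving roughly (n-1)/n of the raw capacity usable. A quick sketch of that arithmetic; the 600 GB drive size is an assumption for illustration, not my actual Velociraptor capacity, and the "smallest drive" rule is a simplification of how parity spaces size themselves:

```python
def parity_usable_gb(drive_sizes_gb):
    """Approximate usable capacity of a single-parity pool: roughly
    one drive's worth of space (sized to the smallest member, as a
    simplification) is consumed by parity data."""
    if len(drive_sizes_gb) < 3:
        raise ValueError("parity needs at least 3 drives")
    smallest = min(drive_sizes_gb)
    n = len(drive_sizes_gb)
    return smallest * (n - 1)

# Three drives in one parity pool (600 GB each is an assumption):
print(parity_usable_gb([600, 600, 600]))  # 1200 GB usable, ~600 GB to parity
```

So the pool costs about a third of the raw space on a 3-drive build, in exchange for surviving a single drive failure without a RAID card.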
Another monthly "Patch Tuesday" just passed by and, like many folks, I updated all my systems with these updates. I also awoke Wednesday and Thursday to machines that had been rebooted because Windows patched itself and restarted the computer. In both cases, I was upset, as I had open documents that I had not saved. To avoid this friction, many software packages are moving to background updating with no notice to users. I question whether this is really better for users.
Everyone knows about software updates thanks to Microsoft and their Windows/Microsoft Update system. Starting out with the "Windows Update" website in the Windows 95 era, the updates were delivered via the internet. Later versions of Windows integrated more tightly with the update service, leading to the current incarnation of Windows Update built into Windows 7 and the soon-to-be-released Windows 8. Microsoft gives the user many levels of customization around the update process. One option, set as the recommended standard by Microsoft, is to install critical updates and reboot after the installation. This has caused issues for many users, where the computer is rebooted without letting the user know, and users have complained about losing data.
This has caused Microsoft to provide deep customization options around their updates to help prevent data loss.
Windows 8 changes this again. Having gone through a few months of patches with my Windows 8 installations, both Customer Preview and Release Preview, I prefer the new updater. Windows 8 performs the monthly patching using the Windows/Microsoft Update process as before. Users can customize the experience, but the reboot is the key difference: Windows 8 notifies the user that they should reboot within the next 3 days before it is done automatically. Finally, Microsoft is on the right path! The only thing better Microsoft could do is figure out how to apply the updates without requiring reboots. As the Windows NT core becomes more and more modular, this should become easier: only the core elements would require the reboot, while all subsystems could be restarted with new code.
Now, take a look at how Adobe, Mozilla and Google are doing their updates. Almost all of them have changed how they update their main products: Flash for Adobe, Firefox for Mozilla, and Chrome for Google. Their most current versions, as well as earlier versions of Chrome, are now set up to automatically download and install updates. With the default settings, all of them do this without notifying the user that anything has changed. The only way to find the current version is to look in the package's "About this product" page or screen. I have not yet heard of issues with this process, but a major concern is what happens when a bad release goes out? Users would be confused as to why their computer wasn't working. A good example of this was Cisco's firmware update for the Linksys EA2700, EA3500 and EA4500 routers in late June. The update forced users off the local administrative system and onto a cloud-based one. There were issues with the cloud-based system and what information it tracked. With no other way to manage their routers, users were given no choice, all caused by automatic updates. Cisco has since reversed course, but the episode has hurt how users perceive them; many are not happy, and some are even returning their units.
As a manager of IT services, this is my biggest concern, and it makes me unwilling to support products that update automatically in the background. Within a managed environment, unannounced changes cause many problems. Microsoft built its monthly patch cycle around this reality of enterprise environments; it is truly designed for IT management systems. The updates are announced upon delivery, which allows IT teams to review them and determine the risks for the organization. It also allows for testing cycles and deployment systems managed by the IT teams. The new unannounced automated updates allow for none of this.
With this movement to unannounced automated changes, some in the tech world see the change as the best thing for users. One argument is that it is good for developers, as products keep improving, similar to how web applications can be upgraded without user intervention. This is a bad comparison: web applications can be fully tested and conformed to "standards", while applications installed on a user's computer are more difficult. Did the software publisher check it in all configurations? That is much easier on controlled platforms like Apple's iOS and Mac OS X. With Microsoft's Windows platform and Linux-based operating systems, it cannot be done easily. In a way, the fact that Microsoft can make Windows work on so many different configurations by working with the hardware providers is absolutely amazing. I suspect that Adobe, Mozilla and Google do not do that sort of in-depth testing.
I can see automatic unannounced updates being a positive thing for consumer users, but personally I do not like it at all. I have told Adobe to inform me of Flash updates instead of just installing them. When I need Firefox, I use a version that does not have automatic updates, and I have stayed mostly on IE for my personal use. To my dismay, Microsoft is now going to start performing automatic updates like Chrome and Firefox. My hope is that they offer a management system for IT teams to control the process. Having worked at Microsoft, I wonder what the internal IT teams there think of this automatic update process.
Further automating the update process will keep more users up to date and improve the overall security of the internet. Microsoft showed this with the move to the monthly patch process. Currently, statistics from security sources like Kaspersky Lab show a major shift by malware writers from attacking Windows directly to using other software as the attack vector, the most popular being Adobe Flash and Oracle/Sun Java. This lets the malware folks infect more than just Windows: Apple Macs, too, and mobile devices running iOS and Google Android. The response to these threats is automated updates of those attack vectors. This helps users and increases security on the internet, but Microsoft has shown that a standard cadence can work. Adobe did try a standard cadence for updates to its products, but has not been able to keep to it given the severity of the security issues it has been patching of late. Instead of trying to make it work, they are moving to the models popularized by Google and then Mozilla.
The downside to all of this is the platform for upgrades. Every product seems to need its own mechanism for monitoring for and applying new updates. Google and Mozilla both now install their own updater service that runs on the computer all the time with administrative privileges; that is the only way for a service to run and install code without user intervention. My IT "spidey senses" go on high alert any time I hear this. Right now, many home computers are most likely running 5-10 updater services of some sort. One solution is for the operating system to provide a standard mechanism for this sort of updating. Another is to use the operating system's task scheduler to schedule checks for updates. One great opportunity is the CoApp project headed up by Garrett Serack (@fearthecowboy) with many contributors; this could be a single updater that all packages use for their updates. Some sort of standardized, single point for updates would make users' systems run cleaner and happier.
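Whatever shape a shared updater takes, the core of every one of these services is the same small check: compare the installed version of each product against the latest published one. A minimal sketch of that logic; the manifest format, product names, and version numbers here are made up for illustration, not from any real updater:

```python
def parse_version(s):
    """Turn '11.3.300' into (11, 3, 300) so versions compare
    numerically, not as strings (where '10' would sort before '9')."""
    return tuple(int(part) for part in s.split("."))

def needs_update(installed, latest):
    """True when the installed version is older than the latest one."""
    return parse_version(installed) < parse_version(latest)

# A hypothetical manifest a shared updater service might poll,
# against the versions found on the local machine:
manifest = {"flash": "11.3.300", "firefox": "14.0.1", "chrome": "20.0.1132"}
installed = {"flash": "11.2.202", "firefox": "14.0.1", "chrome": "19.0.1084"}

stale = [p for p in manifest if needs_update(installed[p], manifest[p])]
print(stale)  # flash and chrome are behind; firefox is current
```

The hard parts, of course, are everything around this check: trusting the manifest, applying the packages safely, and doing it all without yet another always-on privileged service per vendor.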
The issue of unpatched systems on the internet is a major one for all of the computing world, but especially for IT teams and their configuration management. In my review of ITIL/ITSM management philosophies, configuration management is the most critical aspect. Controlling change is how an IT team keeps a company running. It is the one area most IT teams do not do well, and it shows. If the push is toward these unannounced automatic updates for web browsers while more companies use web tools to run their businesses, how will they verify that all their web tools will keep working with each update? Will they see more helpdesk calls from users confused when sites and tools don't work? What do you think?
I am a hoarder much like anyone else. I have a lot of stuff in my house that needs to be sorted through and probably recycled, sold or thrown out. My server drives are full of information that "I might need someday". Worse yet, I got 3 Amazon gift certificates in the past 3 months and I still haven't spent the money on anything. I was looking all week at a couple of items and thought I would share some of my shopping fun.
First and foremost, when I bought my Nokia 920, I got the charging plate for wireless recharging of the battery. Thinking at the time that it was "gimmicky", I never put much stock in it for the long term. Wow, did I make a huge mistake. The wireless charging capability of my Nokia phone is one of the top things I love about it, period! Forget my desire for the Windows Phone 8 OS and the Microsoft infrastructure I am solidly held into; the fact that I can recharge my phone without fumbling with cables is a godsend. The only downside: my phone sits flat on my nightstand. While shopping for Nokia's new wireless car charger/holder, I found the Nokia DT-910 Wireless Charging Stand. This will fit my nightstand needs to a tee.
On top of the charging stand, I have noticed that my Nokia 920 is picking up small "micro scratches" on its Gorilla Glass. I can't think of what could be doing this to the glass, but I need to protect it. Thanks to my friend Mike Bender, there is a lovely solution for that as well. ArmorSuit makes screen protectors for many cell phones, including the Nokia 920. The specific model I want, the ArmorSuit MilitaryShield - Nokia Lumia 920 Screen Protector, is pretty inexpensive and does not interfere with the screen's operation. Mike seemed pretty happy with his, and at $9.95, I am willing to take a trial run with it. I might even put a bumper case around the phone, as I notice I am getting a bit clumsier with this phone than prior ones.
So, these are the two things that have caught my eye as of this week. If Nokia had released their CR200 Wireless Car Charger, I would have added that to my shopping cart as well with no questions asked. I am going to spend some more time perusing some of the games, filters for my Canon camera and lenses, and other gadgets I can't live without.
What are you buying these days? Spending more time and money on technology? Some other craft?
Well, I have been using my Surface RT for over a week now and have been finding its finer points that work for me, along with the things that drive me absolutely batty. Overall, I have found the Surface RT to be a complete replacement for my Asus EP-121 slate. However, I will not be selling/giving away my EP-121 just yet.
Let me explain my main uses for computers in my life. Being that I am a Director of IT in my day job, I am surrounded by technology a lot. My current "arsenal" of systems include:
That is just my home workstations and servers and my workstation at work. That doesn't include my phones or other mobile tech.
Since Windows 8 CP, I have had Windows 8 installed on my Asus EP-121 and enjoyed the environment immensely. I knew from that experience that touch was going to be a key to the success of Windows 8 in general. The UI felt comfortable with mouse and keyboard but was geared for the touch interaction. The overall experience was good, except for driver management and occasional crashes.
Move the calendar to July and Microsoft's announcement of the Surface line. I was jazzed to hear about this foray by Microsoft into hardware platforms. To date, none of the slates/tablets the partners put out delivered the experience. I did like my Asus EP-121, and Samsung had a solid device in its Series 7 slates. Microsoft was going to put a flag in the ground saying this was to be the premier experience for users on the Windows 8 and Windows RT platforms. I spent a lot of time poring over the specs and restrictions of each system, which drove me to an idea.
Starting in July, I restricted my use on the EP-121 to what would be available to me on the Surface RT; I would use M**** apps, Office apps (Word, Excel, PowerPoint and OneNote only) and other built-ins. My mobile digital life revolved around this device and it worked out for me pretty well.
Flash-forward to October 26th and the arrival of my Surface RT; my excitement was immeasurable. I busted it out and started to play with it over the weekend. So far, so good for me with the Surface RT. Now, I have been using the Surface RT all week for both personal and work type activities. Here's what I have found so far:
I go back to my original post on the Surface RT and re-affirm the purchase criteria:
How has your experience with the Surface RT been? Are you a developer and just got yours at BUILD? If you own another tablet, are you considering a Surface RT?
Microsoft Press has been putting out a lot of books lately, and many are free. These books will make great additions to your electronic library. They were kind enough to put up a listing on their blog at http://blogs.msdn.com/b/microsoft_press/archive/2012/05/04/free-ebooks-great-content-from-microsoft-press-that-won-t-cost-you-a-penny.aspx. These offerings from Microsoft Press come in PDF, MOBI and ePub formats. Books currently offered include:
Many of these books are introductory, but they can provide a lot of information about the products. They can also guide you to other resources and give you a stepping stone to learning the new products.
If you have an Amazon Kindle, you can transfer the MOBI versions to your Kindle and to your library. To perform the transfer to your library, use the Send to Kindle software from Amazon. This does not work with the free Kindle reader apps for mobile devices or computers. In the Windows PC reader app, you can open the file directly in the Kindle application on that system, but it does not go to your Amazon library.
Grab the opportunity to get the free books today!