
I passed a Microsoft Beta Exam!

It was the strangest notification I have ever found in my e-mail on a Friday morning: a message from Microsoft Learning inviting me to the Microsoft Certified Professional website, with codes to attach my Microsoft Account to the MCP site and create an account there. This came out of the blue, and I tried to think back over the tests I had taken. Then, it dawned on me … I had passed the beta exam for Windows 8 Configuration.

I sat at my desk in awe. I had heard from others who took the beta tests at North American TechEd that results still had not been posted, and some were getting a bit "testy" about it. Folks from MS Learning were working with them to get the scoring from the testing company, but I had totally forgotten to check my own scores. It took me a few minutes to realize what had happened and that I had passed the exam. I will be honest: given my test scores from TechEd, I did not expect to pass. I learned so much about the certification exam process and the preparation it takes that I did not hold out any expectations.

After entering my code, the MCP website informed me that I now hold a certificate in Windows 8 Configuration. I am eager to finish off the full MCSA: Windows 8 certification as soon as I can. As I have stated before on this blog, I did not work toward certifications in the past. Now, having been to a couple of TechEd conferences and gotten to know other IT Professionals and Trainers, I fully see the value and have started work on my own certifications. This first one under my belt may be a basic one, but it feels good to pass a test after failing a few at TechEd. I better understand what the process entails and will be moving forward with more certifications, both technology and process.

What certifications have you received, and which are you trying to attain? Are you looking at technology, process, or management certifications? What value do you find in certifications?


Like SysInternals Tools? Want to Keep Them Automatically Updated?

If you are an IT Professional and have not heard of the SysInternals tools, you need to get out of the server room more often. The SysInternals tools, originally NTInternals and then Winternals, were built by a company headed by Mark Russinovich and Bryce Cogswell. Their company, started in 1996, offered many tools for IT professionals. In 2006, Microsoft purchased the company and all of its assets, bringing Mark Russinovich to Redmond to join Microsoft.[1] These tools can help IT Professionals and developers understand what is happening when things go wrong.

I want to briefly cover the tools available, as they are easy to forget and yet so amazingly helpful when trying to figure out the problems that come up. Some of my personal favorites are: [2]

  • AdExplorer - Active Directory Explorer is an advanced Active Directory (AD) viewer and editor.
  • Autoruns - See what programs are configured to startup automatically when your system boots and you login. Autoruns also shows you the full list of Registry and file locations where applications can configure auto-start settings.
  • BgInfo - This fully-configurable program automatically generates desktop backgrounds that include important information about the system including IP addresses, computer name, network adapters, and more.
  • BlueScreen - This screen saver not only accurately simulates Blue Screens, but simulated reboots as well (complete with CHKDSK), and works on Windows NT 4, Windows 2000, Windows XP, Server 2003 and Windows 9x.
  • Coreinfo - Coreinfo is a command-line utility that shows you the mapping between logical processors and the physical processor, NUMA node, and socket on which they reside, as well as the caches assigned to each logical processor.
  • Disk2vhd - Disk2vhd simplifies the migration of physical systems into virtual machines (p2v).
  • Diskmon - This utility captures all hard disk activity or acts like a software disk activity light in your system tray.
  • DiskView - Graphical disk sector utility.
  • ListDLLs - List all the DLLs that are currently loaded, including where they are loaded and their version numbers.
  • ProcDump - This command-line utility is aimed at capturing process dumps of otherwise difficult to isolate and reproduce CPU spikes. It also serves as a general process dump creation utility and can also monitor and generate process dumps when a process has a hung window or unhandled exception.
  • Process Explorer - Find out what files, registry keys and other objects processes have open, which DLLs they have loaded, and more. This uniquely powerful utility will even show you who owns each process.
  • Process Monitor - Monitor file system, Registry, process, thread and DLL activity in real-time.
  • PsExec - Execute processes on remote systems.
  • PsFile - See what files are opened remotely.
  • PsGetSid - Displays the SID of a computer or a user.
  • PsInfo - Obtain information about a system.
  • PsKill - Terminate local or remote processes.
  • PsList - Show information about processes and threads.
  • PsLoggedOn - Show users logged on to a system.
  • PsLogList - Dump event log records.
  • PsPasswd - Changes account passwords.
  • PsService - View and control services.
  • PsShutdown - Shuts down and optionally reboots a computer.
  • PsSuspend - Suspend and resume processes.
  • RAMMap - An advanced physical memory usage analysis utility that presents usage information in different ways on its several different tabs.
  • RootkitRevealer - Scan your system for rootkit-based malware.
  • SDelete - Securely overwrite your sensitive files and cleanse your free space of previously deleted files using this DoD-compliant secure delete program.
  • TCPView - Active socket command-line viewer.
  • VMMap - VMMap is a process virtual and physical memory analysis utility.
  • WinObj - The ultimate Object Manager namespace viewer is here.
  • ZoomIt - Presentation utility for zooming and drawing on the screen.
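
To give a quick taste of how the Ps tools work at the command line, here are two typical invocations. This is only a sketch; the computer name and account below are hypothetical placeholders:

  # Run a command on a remote system (PsExec prompts for the password)
  psexec \\fileserver01 -u CONTOSO\admin ipconfig /all

  # Show the process tree on a remote system
  pslist -t \\fileserver01

Neither requires installing an agent on the target beforehand, which is a big part of their charm.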

These tools, and the rest available on the SysInternals website, are immensely important in troubleshooting. Beyond the tools themselves, books have been written on using them to troubleshoot at the system level in Windows. I was lucky enough to attend a class during my years at Microsoft on using the tools to troubleshoot and determine the root causes of crashes.

Two things stand out as the best parts of these tools:

  1. Most are updated at a pretty regular pace, and almost all of them get updated whenever a Windows update or hotfix breaks them.
  2. The tools are available via their live access system at http://live.sysinternals.com and \\live.sysinternals.com\tools\. If you have internet access, the tools are always available through those two paths.

Because the tools update regularly, I built a scheduled task to keep a directory of the tools updated and synchronized. Here are the easy steps to create a job that keeps a synced copy of all the tools on your computer.

  1. Open up Task Scheduler and create a Basic Task
  2. Give the task a name like "SysInternals Tools Sync" and click "Next"
  3. Select what triggers the update of the tools. I chose daily myself. Click "Next".
  4. Enter in the details for the trigger and click "Next".
  5. Select "Start a Program" from the Action list and click "Next".
  6. Enter the following information, then click "Next":
    • Program/Script - robocopy
    • Add Arguments - \\live.sysinternals.com\tools [[Destination Directory]] *.* /z /xo /xf thumbs.db /log:sysinternal.txt
    • Start in - Directory that the log can be written to. On Windows Vista and 7, writing to the root drive would require the task to run elevated.
  7. Check all of your settings and click "Finish"

You can add other actions, like an e-mail notification. I have my task e-mail me as a second action, and the e-mail includes the log created by RoboCopy. Following this post is the XML for a task you can import, modify, and use. It runs RoboCopy to F:\ServerFolders\Tools at 6:08am daily and e-mails the log file saved in F:\TaskLogs. As you can tell, I run this on my Windows Home Server, giving me the tools on my main file server for my network. I also run the same job on my computer at work. Give it a shot! I also plan to convert the job into PowerShell, because I want to.
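
Before we get to the XML, here is a minimal PowerShell sketch of that conversion. This is my rough idea of the script, not a finished product; the destination and log paths are placeholders you would change for your own machine:

  # Sync the SysInternals live share to a local folder, logging the run.
  # The paths below are placeholders - point them wherever you keep tools.
  $source      = '\\live.sysinternals.com\tools'
  $destination = 'C:\Tools\SysInternals'
  $log         = 'C:\TaskLogs\sysinternal.txt'

  # Same switches as the scheduled task: restartable copies (/z),
  # skip files that are not newer (/xo), and ignore thumbs.db.
  robocopy $source $destination *.* /z /xo /xf thumbs.db "/log:$log"

  # Robocopy exit codes below 8 mean success; 8 and up mean failure.
  if ($LASTEXITCODE -ge 8) { Write-Warning "Sync failed - see $log" }

Hook that up to a scheduled task and you get the same result as the wizard steps above.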

<?xml version="1.0" encoding="UTF-16"?>
<Task version="1.2" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <RegistrationInfo>
    <Date>2011-05-07T12:09:24.4810632</Date>
    <Author>WHS\Administrator</Author>
  </RegistrationInfo>
  <Triggers>
    <CalendarTrigger>
      <StartBoundary>2011-05-08T06:08:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
  </Triggers>
  <Principals>
    <Principal id="Author">
      <UserId>WHS\Administrator</UserId>
      <LogonType>Password</LogonType>
      <RunLevel>LeastPrivilege</RunLevel>
    </Principal>
  </Principals>
  <Settings>
    <MultipleInstancesPolicy>IgnoreNew</MultipleInstancesPolicy>
    <DisallowStartIfOnBatteries>true</DisallowStartIfOnBatteries>
    <StopIfGoingOnBatteries>true</StopIfGoingOnBatteries>
    <AllowHardTerminate>true</AllowHardTerminate>
    <StartWhenAvailable>false</StartWhenAvailable>
    <RunOnlyIfNetworkAvailable>false</RunOnlyIfNetworkAvailable>
    <IdleSettings>
      <StopOnIdleEnd>true</StopOnIdleEnd>
      <RestartOnIdle>false</RestartOnIdle>
    </IdleSettings>
    <AllowStartOnDemand>true</AllowStartOnDemand>
    <Enabled>true</Enabled>
    <Hidden>false</Hidden>
    <RunOnlyIfIdle>false</RunOnlyIfIdle>
    <WakeToRun>false</WakeToRun>
    <ExecutionTimeLimit>PT2H</ExecutionTimeLimit>
    <Priority>7</Priority>
  </Settings>
  <Actions Context="Author">
    <Exec>
      <Command>robocopy</Command>
      <Arguments>\\live.sysinternals.com\tools F:\ServerFolders\Tools *.* /z /xo /xf thumbs.db /log:sysinternal.txt</Arguments>
      <WorkingDirectory>F:\TaskLogs</WorkingDirectory>
    </Exec>
    <SendEmail>
      <Server>exchange.j2ed.local</Server>
      <Subject>SysInternals Updated</Subject>
      <To>jareds@j2ed.local</To>
      <From>whs2011@j2ed.local</From>
      <Body>SysInternals have been updated. Log attached.</Body>
      <HeaderFields />
      <Attachments>
        <File>F:\TaskLogs\sysinternal.txt</File>
      </Attachments>
    </SendEmail>
  </Actions>
</Task>


Footnotes

1 – Information about SysInternals and Mark Russinovich found on http://en.wikipedia.org/wiki/Sysinternals

2 – List of tools is from http://technet.microsoft.com/en-us/sysinternals/bb545027


Unannounced Automatic Updates … Are They Really Better For Users?

Another monthly "Patch Tuesday" just passed, and, like many folks, I updated all my systems with the new updates. I also awoke Wednesday and Thursday to machines that had been rebooted because Windows patched itself and restarted the computer. In both cases I was upset, as I had open documents that I had not saved. To make updating easier, many software packages are moving to background updating with no notice to users. I question whether this is really better for users.

Everyone knows about software updates thanks to Microsoft and its Windows/Microsoft Update system. Starting with the "Windows Update" website in the Windows 95 era, updates were delivered via the internet. Later versions of Windows integrated more deeply with the update service, up to the current incarnation of Windows Update built into Windows 7 and the soon-to-be-released Windows 8. Microsoft gives the user many levels of customization around the update process. One option, set as the recommended standard by Microsoft, is to install critical updates and reboot after the installation. This has caused issues for many users whose computers rebooted without warning, and users have complained about losing data:

http://answers.microsoft.com/en-us/windows/forum/windows_vista-system/windows-updates-automatically-and-causes-data-loss/b6978970-2f70-4cb0-9c68-c682903b40ff

http://answers.microsoft.com/en-us/windows/forum/windows_7-windows_update/data-loss-after-installation-of-windows-updates/755965e7-4872-402a-bcee-588700fcbe83?tab=MoreHelp

This has pushed Microsoft to provide deep customization around its updates to help prevent data loss.

Windows 8 changes this again. Having gone through a few months of patches with my Windows 8 installations, both Customer Preview and Release Preview, I prefer the new updater. Windows 8 performs the monthly patching using the Windows/Microsoft Update process as before. Users can customize the experience, but the reboot is the key difference: Windows 8 notifies the user that they should reboot within the next 3 days before it happens automatically. Finally, Microsoft is on the right path! The only thing better Microsoft could do is figure out how to apply updates without requiring reboots at all. As the Windows NT core becomes more and more modular, this should get easier: only the core elements would require a reboot, while the subsystems could simply be restarted with new code.

Now, take a look at how Adobe, Mozilla, and Google are doing their updates. All three have changed how they update their main products: Flash for Adobe, Firefox for Mozilla, and Chrome for Google. The current versions, as with earlier versions of Chrome, are set up to automatically download and install updates. With the default settings, all of them do this without notifying the user that anything changed. The only way to find the current version is to look at the package's "About" page or screen. I have not yet heard of problems with this process, but a major concern is what happens when a bad release ships. Users would be confused as to why their computers stopped working. A good example of this was Cisco's firmware update for the Linksys EA2700, EA3500 and EA4500 routers in late June. The update forced users off their local administrative system and onto a cloud-based one, and there were issues with the cloud-based system and what information it tracked. With no other way to manage their routers, users were left with no choice, all caused by automatic updates. Cisco has since reversed course, but the episode has hurt users' perception of the company; many are unhappy, and some are even returning their units.

As a manager of IT services, this is my biggest concern, and it makes me unwilling to support products that update automatically in the background. Within a managed environment, unannounced changes cause many problems. Microsoft designed its monthly patch cycle around enterprise environments; it is truly built for IT management systems. Updates are announced upon delivery, which lets IT teams review them and assess the risks to their organization. It also allows for testing cycles and deployment systems managed by the IT teams. The new unannounced automated updates allow for none of this.

With this movement to unannounced automated changes, some in the tech world see it as the best thing for users. One argument is that it is good for developers because products keep improving, likening it to how web applications can be upgraded without user intervention. This is a bad comparison: web applications can be fully tested and conformed to "standards", while applications installed on a user's computer are harder to verify. Did the software publisher check it in all configurations? That is much easier on controlled platforms like Apple's iOS and Mac OS X. On Microsoft's Windows platform and Linux-based operating systems, it cannot be done easily. In a way, the fact that Microsoft can make Windows work on so many different configurations, working with the hardware providers, is absolutely amazing. I suspect that Adobe, Mozilla, and Google do not do this sort of in-depth testing.

I can see automatic unannounced updates being a positive thing for consumer users, but personally I do not like them at all. I have told Adobe to inform me of Flash updates instead of just installing them. When I need Firefox, I use a version that does not update automatically, and I have stayed mostly on IE for my personal use. To my dismay, Microsoft is now going to start performing automatic updates like Chrome and Firefox. My hope is that they offer a management system for IT teams to control the process. Having worked at Microsoft, I wonder what the internal IT teams there think of this automatic update process.

Further automating the update process will keep more users up to date and improve the overall security of the internet; Microsoft showed this with the move to the monthly patch process. Currently, statistics from security sources like Kaspersky Lab show a major shift by malware writers from attacking Windows directly to using other software as the attack vector, the most popular being Adobe Flash and Oracle/Sun Java. That shift lets the malware folks infect more than just Windows: Apple Macs and mobile devices running iOS and Google Android as well. The response to these threats has been automated updates of those attack vectors. This helps users and increases security on the internet, but Microsoft has shown that a standard cadence can work too. Adobe did try a standard cadence for updates to its products but has not been able to keep to it, given the severity of the security issues it has been patching of late. Instead of trying to make the cadence work, they are moving to the models popularized by Google and then Mozilla.

The downside to all of this is the platform for upgrades. Every product seems to need its own mechanism for monitoring for and applying new updates. Google and Mozilla both now install their own updater service that runs on the computer all the time and with administrative privileges; that is the only way for a service to install code without user intervention. My IT "spidey senses" go on high alert any time I hear this. Right now, many home computers are most likely running 5-10 updater services of some sort. One solution is for the operating system to provide a standard mechanism for this sort of updating. Another is to use the operating system's task scheduler to schedule checks for updates, as sketched below. One great opportunity is the CoApp project headed up by Garrett Serack (@fearthecowboy) with many contributors; it could be a single updater that all packages use for their updates. Some sort of standardized, single point for updates would make users' systems run cleaner and happier.
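
To illustrate the task-scheduler idea: a vendor could register a daily update check that runs on a schedule instead of installing an always-on privileged service. Here is a minimal sketch using the built-in schtasks utility; the task name and updater path are entirely hypothetical:

  # Register a daily 6:00am update check without a resident service.
  # "ContosoApp" and its updater path are made-up examples.
  schtasks /Create /SC DAILY /ST 06:00 /TN "ContosoApp Update Check" /TR "C:\ContosoApp\UpdateCheck.exe"

The check still needs elevation to install whatever it finds, but at least nothing runs resident between checks.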

The issue of unpatched systems on the internet is a major one for the whole computing world, but especially for IT teams and their configuration management. In my review of ITIL/ITSM management philosophies, configuration management is the most critical aspect: controlling change is how an IT team keeps a company running. It is the one area most IT teams do not do well, and it shows. If the push is toward unannounced automatic updates for web browsers while more companies use web tools to run their businesses, how will they verify that all those web tools keep working with each update? Will they see more Helpdesk calls from users confused when sites and tools stop working? What do you think?

Jared


Office 2013 and the Art of the Announcement

Microsoft took its show on the road to California again, this time to San Francisco instead of Los Angeles as with the Surface announcement. Steve Ballmer got up in front of the media to talk about how the company's strategy is coming together and, specifically, to announce the public beta of Office 2013. While I will talk a bit about the technology, I mostly want to discuss how Microsoft has set up both the Surface and Office 2013 announcements, taking a page from Apple's great showmanship.

First, let's talk about the product. Microsoft Office is the leading productivity package used by people around the world. Starting as just Word back in 1983, Microsoft expanded on Word with the purchase of Forethought in 1987, adding PowerPoint to its productivity software lineup. The first complete Office package as we know it today was released in 1995 as Office 95. It was followed closely by an upgrade in 1997, aptly called Office 97 and code-named "Ren and Stimpy". The year-of-release naming continued in later releases such as Office 2000, 2003, 2007, and 2010. The only exception was Office XP, released in 2001 to coincide with the launch of Windows XP.[1] What we see out of Microsoft today is a three-year production cycle for Office, something enterprise users can build a cadence around in either three- or six-year cycles.

With Office 2013, Microsoft is trying to simplify the user interface along with the ribbon. Changes include moving toward the "Metro"-styled ribbon by capitalizing the tab names, making the UI elements flatter, and simplifying the look and feel. The biggest irony for me was installing Office Professional Plus on my EP121 tablet running Windows 8 Release Preview, as it only had OneNote 2010 and Lync 2010 installed before. I use OneNote heavily, and I love the new interface of the Desktop version on Windows 8. Hearing in the announcement that OneNote and Lync had Windows 8 Metro Experiences (aka MX) available, I went looking for them but did not find them anywhere. Thanks to Mary Jo Foley of ZDNet and All About Microsoft, I learned these two apps will be available in the Windows Store shortly. And even bigger thanks to fellow Krewe member Aubrey, who pointed out I could install the Office 365 preview version of Office 2013 and get them now. Tempting for sure, but I will stay with this install and wait for the versions in the Store. More information on the Office 2013 release is available from Win Super Site and All About Microsoft.

Now, here come my complaints about the Apple-like showmanship of Surface and this announcement. First, I am an IT Manager, and I have a day job; I blog because I like to write, but it is not my job. My heart goes out to the many who are journalists, and I do not claim to know their work, though having had drinks with a few and gotten to know them, I have some understanding of what they go through. Having said that, and having seen Paul Thurrott and Mary Jo Foley at MS TechEd in Orlando in June and watched Mary Jo Foley's coverage of the MS Worldwide Partner Conference in Toronto in July, I have to ask: why did Microsoft hold separate events for these announcements? Much of the media that covers Microsoft was already at both events. Why make them arrange last-minute travel to venues that are not even standard spots for Microsoft?

Let's do a quick comparison between Microsoft and Apple. Microsoft has a large development conference called Build, and it announced Windows 8 there. Apple holds its Worldwide Developer Conference in San Francisco each June and announces its new iOS or even Mac OS there. Microsoft has its TechEd conference in June as well, a perfect time to make announcements about technologies like Exchange, SharePoint, and Lync, but it held those back for an Office announcement. Okay, I understand wanting to keep the message together. But then the next obvious place to announce is WPC. Who better to announce to, as your "rabid audience", than partners looking to sell these solutions? But no, Microsoft did not take advantage of this "home turf". Instead, it staged two separate events on short notice, forcing the tech media to jump if they wanted coverage. This sounds so much like Apple and the announcements it makes through the year away from WWDC.

I like that Microsoft is jumping out and getting into the spotlight. I like that they are being mysterious and are capable of it; the Microsoft I worked at was so full of holes that information leaked out like a sieve. The Surface announcement was a great notion but left me with so many questions. Why have it in LA and not Seattle? Why make an announcement of that type at a last-minute event instead of using something like TechEd? Since the media was not allowed hands-on time, why not show it off at TechEd? Then the media would not have been the only ones in the room. Surface would have gotten a huge standing ovation at TechEd, with possible Seattle Sounders-style chanting in the keynote. Instead, many a TechEd fan and attendee was left crushed, watching this important announcement go by without the most important user base getting any chance to look at the next step for Microsoft.

With the Office announcement, Microsoft did it again. They had a chance to announce something big at their major annual partner event but passed. They had a chance to announce it on their home turf and make the tech press come to them, but passed on that too, going to the press in the Bay Area instead. What does this say? It smacks of conceding that the center of the tech world is San Francisco and Silicon Valley: if you want to release something, you must come to us. Given the size of both of these messages and the baseline of support, Microsoft should keep such announcements on its home turf or in front of its home crowd. That is the real lesson to learn from Apple here: Apple makes its announcements in front of its developers. Use your home crowd for your announcements, Microsoft, and don't be scared to invite the media to your events.

What do you think of Office 2013? What do you think of Microsoft's announcements and styles? Should they have kept them separate from the events or brought the announcements to their planned events? Leave a comment below.

Jared


Footnote:

1 – Office history courtesy of http://www.intowindows.com/microsoft-office-history-in-brief/


The Future of Managing Microsoft Products - PowerShell

One key message I took from both Microsoft TechEds I have attended was how Microsoft plans for its products to be managed: PowerShell. Some might say the "grey beards" have infiltrated Redmond and put a CLI management system into the products. With the planned release of PowerShell v3.0 in Windows Server 2012, PowerShell becomes the key skill for IT Professionals to learn. Those who do not learn it may be left behind.

I remember building VBScript scripts to execute scheduled jobs and perform maintenance tasks over the years; I built them on Windows 2000 Server and Server 2003. I also remember looking at Linux and its BASH environment for scripting and managing systems. I loved the idea of a scripting language that was easy to execute remotely and manage systems from, that had broad access to the system without add-ons, and that could be run live at the command line. VBScript did not offer this on Windows. Yes, I could build, buy, or find COM objects to reach other parts of the system from VBScript, but I kept looking enviously at BASH.

Fast forward to 2006 and the tooting of the PowerShell horns. PowerShell v1 was released for the Windows XP/2003 series of systems and for Windows Vista. I played with it initially, but it never caught on for me; I could see its future and hoped for more. When v2.0 of PowerShell arrived with Windows 7 and Windows Server 2008 R2, I started to be genuinely impressed. I dabbled more and more with it, but it remained dabbling. I was confused by much of it and did not invest the time I should have. (Dates verified thanks to the Wikipedia article on Windows PowerShell.)
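
That v2.0 release also brought the remoting I had envied in BASH. As a small hedged sketch (the server names and the service being checked are just examples), one pipeline can query a whole list of machines:

  # Query a service across remote machines using PowerShell v2.0 remoting.
  # Requires Enable-PSRemoting on the targets; SRV01/SRV02 are made up.
  Invoke-Command -ComputerName SRV01, SRV02 -ScriptBlock {
      Get-Service -Name wuauserv
  } | Select-Object PSComputerName, Name, Status

Invoke-Command tags each returned object with PSComputerName, so one command line replaces logging into every server by hand.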

Now, on the cusp of the Windows 8 and Server 2012 releases, I am re-energized by PowerShell. In reviewing Microsoft's technologies and researching this piece, I learned one key thing: Microsoft is betting heavily on PowerShell. Many of its products' management utilities are now just a GUI on top of PowerShell scripting. Key examples are Exchange Server 2007 and 2010; Lync Server 2010; System Center 2007 (Virtual Machine Manager), 2010, and 2012; and SharePoint 2010. All of these GUIs execute PowerShell behind the scenes. In many cases the management console can handle most tasks, but the GUI does not expose everything the PowerShell cmdlets can do. An IT Professional working in PowerShell at the command line can do more than the management console allows.
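
Exchange 2010 is a good illustration: its console generates calls to cmdlets like Get-Mailbox and Set-Mailbox behind the scenes, and at the shell you can do bulk work the GUI makes tedious. A hedged sketch; the cmdlets and parameters are real Exchange 2010 ones, but the database name is made up:

  # Raise quotas for every mailbox on one database in a single pipeline.
  # "DB01" is a hypothetical database name.
  Get-Mailbox -Database "DB01" -ResultSize Unlimited |
      Set-Mailbox -UseDatabaseQuotaDefaults $false `
          -IssueWarningQuota 1.9GB -ProhibitSendQuota 2GB

Doing the same in the console would mean opening the properties of every mailbox by hand.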

With the advances in PowerShell v3, I am looking at how to learn more. There are a few good resources online, such as:

TechNet - http://technet.microsoft.com/en-us/library/bb978526.aspx

PowerShell.com - http://powershell.com/

PowerShell Blog - http://blogs.msdn.com/b/powershell/

Scripting Guy - http://technet.microsoft.com/en-us/scriptcenter/bb410849.aspx

CodePlex - http://codeplex.com

Another way to learn more is to attend conferences like TechEd or other educational opportunities. One trainer to look for is Don Jones, a multi-year MVP in the PowerShell technology; his sessions tend to fill quickly. Don also has a book on PowerShell basics called "Learn Windows PowerShell in a Month of Lunches". My intent is to get the book and spend my time learning PowerShell, as I feel all IT Professionals should. I am also going to encourage my staff at my job to do the same.

What do you think of PowerShell? Are you going to spend time learning it?

Jared
