
Update on my 60 Days to MCSE

Wow, I have not posted on this topic for a very, very long time. I am writing an update on my path toward MCSE: Private Cloud and the missteps I took along the way. As it stands, I am way past 60 days from my start and I haven't even gotten out of the blocks.

Back in April 2012, a group of IT professionals started our way to the new Microsoft Certified Solutions Expert for Private Cloud (MCSE:PC). It marked Microsoft's return to the MCSE certification, which had been retired when the Windows Server 2008 certifications moved to Microsoft Certified Information Technology Professional (MCITP). The old Microsoft Certified Systems Engineer is now the Microsoft Certified Solutions Expert; the word "Engineer" was dropped from the title, reportedly to avoid problems with governments and licensing boards around the world over the use of that term.1 Many of us who had some or none of the certificates needed for the new MCSE:PC set ourselves up to earn it by MS TechEd in Orlando in June.

In my case, my eyes were a lot bigger than my "stomach" for this process. Early on, my job took a big bite out of my time. One thing to understand is that I work in IT management, not in a technical position per se. My time is spent in meetings, maintaining projects, and working through administrative items, so technical certifications are something I pursue on my own time, not a direct responsibility of my job. With my time burned by work and the other projects I take on outside of it, there was not much left for my certification.

The next big obstacle on my certification path was the complexity of the certifications themselves. Take a friend of mine, Peter Gray, who works as an Active Directory engineer for his day job. That should make the 70-640, Windows Server 2008 Active Directory, Configuring, a slam dunk for him. Come to find out, he did not pass his 70-640 test. He explains everything he learned from preparing for and taking the exam on his own blog, but it made me think this specifically:

He does this as his daily job and did not pass. I am out of the technical aspects most of the time because my role is more administrative. I can't just jump in and pass the test.

Microsoft has made a lot of changes to its certification testing to weed out "paper experts" who can pass the tests but cannot apply that knowledge in real life. This is what I felt I was up against early in my career, when I had practical, hands-on understanding while paper experts were getting hired because they had initials after their names and I did not.

Given these things, I focused on three tests in particular, the 70-640, 70-246, and 70-247, and my plan was to take all three at MS TechEd. I worked through the great TrainSignal training provided to the 60Days2MCSE group for the 70-640 test and built out more of my home Windows 2008 R2 AD infrastructure. On top of that, I tried to set up some labs around the SCCM environment to prepare for the 246 and 247 tests. It was difficult, but I did get through one quick pass. Lastly, I sat in on a 70-640 test prep session at MS TechEd before I took the test and used the lab provided to do practice tests.

After all of that preparation, I took the 70-246 and 70-247 exams for free and paid for the 70-640. In all three cases, I did not score high enough to pass. While failing the free exams was a bit of a blow, failing the 70-640 was tougher to take. I had hoped that, with the prep work I had done, I would pass.

Now it is September, work has taken over much of my time again, and I am returning to my personal projects such as this blog and my certifications. I have reflected on the process and still plan to get my certifications. I will work through them differently than before and not assume I will pass them immediately, as I did the first time.

What sort of certification stories do you have? Have you worked through certifications but not passed, like I did? If so, what did you do to get yourself situated to pass?

--------

NOTES

1 - The editorial at http://certcities.com/editorial/news/story.asp?EditorialsID=132&page=2 discusses dropping "Engineer" from the title in an effort to appease licensing boards in Canada. This has come up as speculation in conversations with folks; at this time, I do not know if Microsoft ever confirmed it.


I passed a Microsoft Beta Exam!

It was the strangest notification I have ever seen in my e-mail on a Friday morning. Microsoft Learning wrote to invite me to the Microsoft Certified Professional website, giving me codes to attach my Microsoft Account to the MCP site and create an account there. This came out of the blue, and I tried to think back over the tests I had taken. Then it dawned on me … I had passed the beta exam for Windows 8 Configuration.

I sat at my desk in awe. I had heard from others who took the beta tests at North American TechEd that results still had not been posted; some were getting a bit "testy" about it. Folks from MS Learning were working with them to get the scoring from the testing company, but I had totally forgotten to check my own scores. It literally took me a few minutes to realize what had happened and that I had passed the exam. I will be honest: given my test scores from TechEd, I did not expect to pass. I had learned so much about the exam-taking process and preparation that I did not hold out any expectations.

After entering my code, the MCP website informed me that I now hold a certificate in Windows 8 Configuration. I am eager to finish off the full MCSA: Windows 8 certification as soon as I can. As I stated before on this blog, I did not work toward certifications in the past. Now, having been to a couple of TechEd conferences and gotten to know other IT professionals and trainers, I fully see the value and have started work on my own certifications. This first one under my belt may be a basic one, but it feels good to pass a test after failing a few at TechEd. I have a better understanding of what this process entails and will be moving forward with more certifications, both technology and process.

What sort of certifications have you received, and which are you trying to attain? Are you looking at technology, process, or management certifications? What value do you find in certifications?


C3PI? CSI? CNN?… or … How cell phone companies are trying to make more on us!

I am an avid listener of Security Now! from the TWiT netcast network. Steve Gibson of Gibson Research Corporation has an understanding of security deep enough to explain it down to the basic systems and even the components. If it says anything, they just released their 373rd episode yesterday, October 10, 2012. This past episode was one of his Q&A shows. In it, Russell in London wrote in to describe something he saw affecting Verizon Wireless users.1 Steve explained how CPNI (Customer Proprietary Network Information) was going to be sold by Verizon to marketers unless you opt out within the next 30 days. My reaction was, "What? Is this something I need to worry about on another carrier?" The answer is an emphatic "YES" if you care about your privacy.

If we go back to my background, I am not a fan of Google products. The main reason is that Google is an advertising company, not some "do no evil to every little thing" company. They are there to sell advertising, and to best sell that advertising space to other companies, they need to know their audience. That is where the sheep Google users come in. They give Google all the information about themselves that they can: every search, every e-mail, every document they write … all of it congealed into audience profiles. Then Google can tell the companies buying its advertising exactly who they are marketing to.

The theory of targeted marketing is a great one. I don't care to see ads for feminine hygiene products. I also don't care to see ads for baby products. These are two areas I just have no need to see. Give me guns, gadgets, and girls. (Reminder: Insert legal clip of Tim Allen doing his monkey grunt here.) The problem is that targeted marketing, as currently implemented, requires me to give all of my information to some company to do that filtering. That means handing over every little bit of my information and trusting that company to "do no evil". There is an inherent problem with this: companies do what they must to keep their stockholders happy. This basic fact is why corporate banks do things differently than a credit union, and why large corporate ISPs do things differently than the mom-and-pop ISP down the street.

How does this all relate to this post, Jared? Well, cell phone companies saw an opportunity. They already gather all of our phone, text, and mobile internet usage information to gouge, er, bill us appropriately for our services. Thanks to the keen eyes of Russell in London, it appears at least one of them is going to try to sell that information to marketing companies. So they get us coming and going, making every bit of profit they can on us, their users. Sound familiar?

If you are one to care about your privacy, here are links to both Verizon's and AT&T's opt-out programs for CPNI transfer. If you are on another carrier, do a search in your favorite search engine for the carrier name and "CPNI opt out".

What are your feelings about carriers selling your information? Are you okay with it? What about other companies that you buy services and goods from?


Footnotes:

1 – Question number 5 in Security Now!’s episode 373. Show notes are available at http://wiki.twit.tv/wiki/Security_Now_373


Like SysInternals Tools? Want to Keep Them Automatically Updated?

If you are an IT professional and have not heard of the SysInternals tools, you need to get out of the server room more often. The SysInternals tools, originally NTInternals and then WinInternals, were built by a company headed by Mark Russinovich and Bryce Cogswell. Their company, started in 1996, offered many tools for IT professionals. In 2006, Microsoft purchased the company and all of its assets, bringing Mark Russinovich to Redmond to join Microsoft.[1] These tools can help IT professionals and developers understand what is happening when things go wrong.

I want to briefly cover the available tools, as they can be forgotten and yet be so amazingly helpful when trying to figure out the problems that come up. Some of my personal favorites are:[2]

  • AdExplorer - Active Directory Explorer is an advanced Active Directory (AD) viewer and editor.
  • Autoruns - See what programs are configured to startup automatically when your system boots and you login. Autoruns also shows you the full list of Registry and file locations where applications can configure auto-start settings.
  • BgInfo - This fully-configurable program automatically generates desktop backgrounds that include important information about the system including IP addresses, computer name, network adapters, and more.
  • BlueScreen - This screen saver not only accurately simulates Blue Screens, but simulated reboots as well (complete with CHKDSK), and works on Windows NT 4, Windows 2000, Windows XP, Server 2003 and Windows 9x.
  • Coreinfo - Coreinfo is a new command-line utility that shows you the mapping between logical processors and the physical processor, NUMA node, and socket on which they reside, as well as the caches assigned to each logical processor.
  • Disk2vhd - Disk2vhd simplifies the migration of physical systems into virtual machines (p2v).
  • Diskmon - This utility captures all hard disk activity or acts like a software disk activity light in your system tray.
  • DiskView - Graphical disk sector utility.
  • ListDLLs - List all the DLLs that are currently loaded, including where they are loaded and their version numbers.
  • ProcDump - This command-line utility is aimed at capturing process dumps of otherwise difficult to isolate and reproduce CPU spikes. It also serves as a general process dump creation utility and can also monitor and generate process dumps when a process has a hung window or unhandled exception.
  • Process Explorer - Find out what files, registry keys and other objects processes have open, which DLLs they have loaded, and more. This uniquely powerful utility will even show you who owns each process.
  • Process Monitor - Monitor file system, Registry, process, thread and DLL activity in real-time.
  • PsExec - Execute processes on remote systems.
  • PsFile - See what files are opened remotely.
  • PsGetSid - Displays the SID of a computer or a user.
  • PsInfo - Obtain information about a system.
  • PsKill - Terminate local or remote processes.
  • PsList - Show information about processes and threads.
  • PsLoggedOn - Show users logged on to a system.
  • PsLogList - Dump event log records.
  • PsPasswd - Changes account passwords.
  • PsService - View and control services.
  • PsShutdown - Shuts down and optionally reboots a computer.
  • PsSuspend - Suspend and resume processes.
  • RAMMap - An advanced physical memory usage analysis utility that presents usage information in different ways on its several different tabs.
  • RootkitRevealer - Scan your system for rootkit-based malware.
  • SDelete - Securely overwrite your sensitive files and cleanse your free space of previously deleted files using this DoD-compliant secure delete program.
  • TCPView - Active socket command-line viewer.
  • VMMap - VMMap is a process virtual and physical memory analysis utility.
  • WinObj - The ultimate Object Manager namespace viewer is here.
  • ZoomIt - Presentation utility for zooming and drawing on the screen.

These tools, and the rest available on the SysInternals website, are immensely important in troubleshooting. Beyond the tools themselves, books have been written on using them to troubleshoot at the system level in Windows. I was lucky enough to attend a class during my years at Microsoft on using the tools to troubleshoot and determine crash root causes.

The best part about these tools comes down to two things:

  1. Some are updated at a pretty regular pace, and almost all of the tools get updated when Windows updates or hotfixes break them.
  2. The tools are available via their live access system at http://live.sysinternals.com and \\live.sysinternals.com\tools\. If you have access to the internet, the tools are always available through those access methods.
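
For example, you can run a tool straight from the live share without installing anything. Here is a quick PowerShell one-liner as an illustration (procexp.exe is Process Explorer's executable; the first run prompts you to accept the EULA):

# Launch Process Explorer directly from the SysInternals live share.
Start-Process '\\live.sysinternals.com\tools\procexp.exe'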

Since the tools update regularly, I built a scheduled task to keep a local directory of them synchronized. Here are the easy steps to create a job that keeps a synced copy of all the tools on your computer.

  1. Open up Task Scheduler and create a Basic Task
  2. Give the task a name like "SysInternals Tools Sync" and click "Next"
  3. Select what triggers the update of the tools. I chose daily myself. Click "Next".
  4. Enter in the details for the trigger and click "Next".
  5. Select "Start a Program" from the Action list and click "Next".
  6. Enter the following information then click “Next”
    • Program/Script - robocopy
    • Add Arguments - \\live.sysinternals.com\tools [[Destination Directory]] *.* /z /xo /xf thumbs.db /log:sysinternal.txt
    • Start in - Directory that the log can be written to. On Windows Vista and 7, writing to the root drive would require the task to run elevated.
  7. Check all of your settings and click "Finish"

You can add other actions such as e-mail notification. I have my task e-mail me as a second action, and the e-mail includes the log created by RoboCopy. Following this post is the XML for a task you can import, modify, and use. It runs RoboCopy to F:\ServerFolders\Tools at 6:08am daily and e-mails the log file saved in F:\TaskLogs. As you can tell, I run this on my Windows Home Server, giving me the tools on the main file server for my network. I also run this same job on my computer at work. Give it a shot! I also plan to convert the job to PowerShell just because I want to; a first sketch of that follows the XML.

<?xml version="1.0" encoding="UTF-16"?>
<Task version="1.2" xmlns="http://schemas.microsoft.com/windows/2004/02/mit/task">
  <RegistrationInfo>
    <Date>2011-05-07T12:09:24.4810632</Date>
    <Author>WHS\Administrator</Author>
  </RegistrationInfo>
  <Triggers>
    <CalendarTrigger>
      <StartBoundary>2011-05-08T06:08:00</StartBoundary>
      <Enabled>true</Enabled>
      <ScheduleByDay>
        <DaysInterval>1</DaysInterval>
      </ScheduleByDay>
    </CalendarTrigger>
  </Triggers>
  <Principals>
    <Principal id="Author">
      <UserId>WHS\Administrator</UserId>
      <LogonType>Password</LogonType>
      <RunLevel>LeastPrivilege</RunLevel>
    </Principal>
  </Principals>
  <Settings>
    <MultipleInstancesPolicy>IgnoreNew</MultipleInstancesPolicy>
    <DisallowStartIfOnBatteries>true</DisallowStartIfOnBatteries>
    <StopIfGoingOnBatteries>true</StopIfGoingOnBatteries>
    <AllowHardTerminate>true</AllowHardTerminate>
    <StartWhenAvailable>false</StartWhenAvailable>
    <RunOnlyIfNetworkAvailable>false</RunOnlyIfNetworkAvailable>
    <IdleSettings>
      <StopOnIdleEnd>true</StopOnIdleEnd>
      <RestartOnIdle>false</RestartOnIdle>
    </IdleSettings>
    <AllowStartOnDemand>true</AllowStartOnDemand>
    <Enabled>true</Enabled>
    <Hidden>false</Hidden>
    <RunOnlyIfIdle>false</RunOnlyIfIdle>
    <WakeToRun>false</WakeToRun>
    <ExecutionTimeLimit>PT2H</ExecutionTimeLimit>
    <Priority>7</Priority>
  </Settings>
  <Actions Context="Author">
    <Exec>
      <Command>robocopy</Command>
      <Arguments>\\live.sysinternals.com\tools F:\ServerFolders\Tools *.* /z /xo /xf thumbs.db /log:sysinternal.txt</Arguments>
      <WorkingDirectory>F:\TaskLogs</WorkingDirectory>
    </Exec>
    <SendEmail>
      <Server>exchange.j2ed.local</Server>
      <Subject>SysInternals Updated</Subject>
      <To>jareds@j2ed.local</To>
      <From>whs2011@j2ed.local</From>
      <Body>SysInternals have been updated. Log attached.</Body>
      <HeaderFields />
      <Attachments>
        <File>F:\TaskLogs\sysinternal.txt</File>
      </Attachments>
    </SendEmail>
  </Actions>
</Task>
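
Since I plan to convert this job to PowerShell anyway, here is a minimal sketch of what that script could look like. It is a starting point, not a finished implementation; the paths and mail settings mirror the XML above, the script file name is hypothetical, and everything should be adjusted for your environment.

# Sync-SysInternals.ps1 (hypothetical name) - sync the live share locally, then e-mail the log.
# Paths and mail settings are examples from my setup; change them for yours.
$source      = '\\live.sysinternals.com\tools'
$destination = 'F:\ServerFolders\Tools'
$log         = 'F:\TaskLogs\sysinternal.txt'

# /z = restartable copy, /xo = skip files older than the destination copy, /xf = exclude thumbs.db
robocopy $source $destination *.* /z /xo /xf thumbs.db /log:$log

# Send-MailMessage is built into PowerShell 2.0 and later.
Send-MailMessage -SmtpServer 'exchange.j2ed.local' `
                 -From 'whs2011@j2ed.local' `
                 -To 'jareds@j2ed.local' `
                 -Subject 'SysInternals Updated' `
                 -Body 'SysInternals have been updated. Log attached.' `
                 -Attachments $log

To schedule it, point the task's action at powershell.exe with the arguments -ExecutionPolicy Bypass -File followed by the script path; the trigger settings stay the same.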


Footnotes

1 – Information about SysInternals and Mark Russinovich found on http://en.wikipedia.org/wiki/Sysinternals

2 – List of tools is from http://technet.microsoft.com/en-us/sysinternals/bb545027


Unannounced Automatic Updates … Are They Really Better For Users?

Another monthly "Patch Tuesday" just passed and, like many folks, I updated all my systems with the new updates. I also awoke Wednesday and Thursday to machines that had rebooted overnight because Windows patched itself. In both cases I was upset, as I had open documents that I had not saved. Meanwhile, to make updating easier, many software packages are moving to background updating with no notice to users. I question whether this is really better for users.

Everyone knows about software updates thanks to Microsoft and its Windows/Microsoft Update system. Starting out as the "Windows Update" website in the Windows 95 days, updates were delivered via the internet. Later versions of Windows integrated more deeply with the update service, leading to the current incarnation of Windows Update built into Windows 7 and the soon-to-be-released Windows 8. Microsoft gives the user many levels of customization for the update process. One option, set as the recommended standard by Microsoft, is to install critical updates and reboot after installation. This has caused issues for many users whose computers rebooted without warning, and users have complained about losing data:

http://answers.microsoft.com/en-us/windows/forum/windows_vista-system/windows-updates-automatically-and-causes-data-loss/b6978970-2f70-4cb0-9c68-c682903b40ff

http://answers.microsoft.com/en-us/windows/forum/windows_7-windows_update/data-loss-after-installation-of-windows-updates/755965e7-4872-402a-bcee-588700fcbe83?tab=MoreHelp

This has caused Microsoft to provide deep customization around its updates to help prevent data loss.

Windows 8 changes this again. Having gone through a few months of patches with my Windows 8 installations, both Customer Preview and Release Preview, I prefer the new updater. Windows 8 performs the monthly patching through the Windows/Microsoft Update process as before, and users can customize the experience, but the reboot is the key difference: Windows 8 notifies the user that the machine will reboot within the next 3 days, before it is done automatically. Finally, Microsoft is on the right path! The only thing better Microsoft could do is figure out how to apply updates without requiring reboots at all. As the Windows NT core becomes more and more modular, this should get easier: only changes to core elements would require a reboot, while subsystems could simply be restarted with the new code.

Now take a look at how Adobe, Mozilla, and Google handle updates for their main products: Flash for Adobe, Firefox for Mozilla, and Chrome for Google. The most current versions, as well as earlier versions of Chrome, are set up to automatically download and install updates, and with default settings all of them do this without notifying the user that anything changed. The only way to find the current version is to look in the package's "About this product" page or screen. I have not yet heard of issues with this process, but a major concern is: what happens when a bad release goes out? Users would be confused as to why their computer stopped working. A good example of this was Cisco's firmware update for the Linksys E2700, E3500, and E4500 in late June. The update took away the local administrative interface and replaced it with a cloud-based system, and there were issues with the cloud-based system and what information it tracked. With no other way to manage their routers, users were given no choice, all caused by automatic updates. Cisco has since reversed the change, but it has hurt their standing with users; many are unhappy and some are even returning their units.

As a manager of IT services, I find this my biggest concern, and it makes me unwilling to support products that update automatically in the background. Within a managed environment, unannounced changes cause many problems. Microsoft designed its monthly update cycle around enterprise environments; it is truly built for IT management systems. Updates are announced upon delivery, which allows IT teams to review them and determine the risk to their organization. It also allows for testing cycles and deployment systems managed by the IT teams. The new unannounced automated updates allow for none of this.

With this movement to unannounced automated changes, some in the tech world see the change as the best thing for users. One argument is that it is good for developers because products keep improving, similar to how web applications can be upgraded without user intervention. This is a bad comparison: web applications can be fully tested and conformed to "standards", while applications installed on a user's computer are more difficult. Did the software publisher check every configuration? That is much easier on controlled platforms like Apple's iOS and Mac OS X; on Microsoft's Windows platform and Linux-based operating systems, it cannot be done easily. In one way, the fact that Microsoft can make Windows work on so many different configurations, working with the hardware providers, is absolutely amazing. I suspect that Adobe, Mozilla, and Google do not do this sort of in-depth testing.

I can see automatic unannounced updates being a positive thing for consumer users, but personally I do not like them at all. I have told Adobe to inform me of Flash updates instead of just installing them. When I need Firefox, I use a version that does not update automatically, and I have mostly stayed on IE for my personal use. To my dismay, Microsoft is now going to start performing automatic updates like Chrome and Firefox. My hope is that they offer a management system for IT teams to control the process. Having worked at Microsoft, I wonder what the internal IT teams there think of this automatic update process.

Further automating the update process will keep more users up to date and improve the overall security of the internet; Microsoft showed this with the move to the monthly patch process. Currently, statistics from security sources like Kaspersky Lab show a major shift among malware writers from attacking Windows directly to using other software as the attack vector, the most popular being Adobe Flash and Oracle/Sun Java. This lets the malware folks infect more than just Windows: Apple Macs and mobile devices running iOS and Google Android as well. The response to these threats is to automatically update those attack vectors. This helps users and increases security on the internet, but Microsoft has shown that a standard cadence can work. Adobe did try a standard cadence for its updates but has not been able to keep to it, given the severity of the security issues being patched as of late. Instead of trying to make it work, they are moving to the model popularized by Google and, later, Mozilla.

The downside to all of this is the platform for upgrades. Every product seems to ship its own mechanism for monitoring for and applying new updates. Google and Mozilla both now install their own updater service that runs on the computer all the time and with administrative privileges; that is the only way for a service to run and install code without user intervention. My IT "spidey senses" go on high alert any time I hear this. Right now, many home computers are most likely running 5-10 updater services of some sort (you can check your own machine with the sketch below). One solution is for the operating system to provide a standard mechanism for this sort of updating. Another is to use the operating system's task scheduling system to schedule checks for updates. One great opportunity is the CoApp project headed up by Garrett Serack (@fearthecowboy) with many contributors; it could be a single updater that all packages use for their updates. Some sort of standardized, single point for updates would make users' systems run cleaner and happier.
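
Curious how many updaters are on your own machine? A quick PowerShell check along these lines lists services whose display name mentions updating. It is a rough heuristic sketch, not a complete inventory; updaters that use scheduled tasks instead of services will not show up:

# List services whose display name mentions updating (heuristic match).
Get-Service |
    Where-Object { $_.DisplayName -match 'update' } |
    Select-Object Status, Name, DisplayName |
    Format-Table -AutoSize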

The issue of unpatched systems on the internet is a major one for the entire computing world, but especially for IT teams and their configuration management. In my review of ITIL/ITSM management philosophies, configuration management is the most critical aspect: controlling change is how an IT team keeps a company running. It is the one area most IT teams do not do well, and it shows. If the push is toward unannounced automatic updates for web browsers, and more companies use web tools to run their businesses, how will they verify that all of those tools keep working with each update? Will they see more Helpdesk calls from users confused when sites and tools stop working? What do you think?

Jared
