My week at TechEd 2013 in NOLA

Last week I attended my first TechEd. I am a Louisianian, and this was my first trip back since 2005 and my first trip to New Orleans (NOLA) since 1989. The best part: I got to share all of the things I remember from my younger days with Jeff, since he went with me. We had beignets and café au lait at Café du Monde, chateaubriand at Antoine’s, thin fried catfish and the ‘44’ from NOLA Hamburger and Seafood Company, and signature drinks from Bourbon Street. But I guess I better stick with the part that matters, huh?

I must admit, that many people in one place is not really my normal way of spending a day, much less a week. When I walked into the conference center on Monday morning, I could not believe how many people were there. The estimate was 10-12K in attendance. There were sessions that covered everything from SQL to hacking to debugging/troubleshooting to PowerShell scripting. There were also sessions about the upcoming R2 releases of Windows Server 2012 and System Center 2012. Gabe wrote up a great blog post about his experience as well.

Before the keynote, the Treme Brass Band performed while the MANY participants were streaming into the room. They were great. The keynote touched on the upcoming Windows 8.1 release, changes to the Azure hosting prices, a really cool demo of SQL Server 2014, and a dev session (I was completely lost during that part).

Microsoft offered special pricing on the Surface RT and Surface Pro and allowed each participant to purchase one of each. The line was hours long each day except for Thursday (no line at all). All of us tech dweebs, of course, waited in the long lines rather than wait until the final day. What if they ran out? Or you just had to have it to play around with during your downtime. Even waiting in line was cool because of the networking you got to do. My group consisted of a sysadmin from Trek (we talked about our love for PowerShell – he works remote like we do, so he didn’t know my buddy who works at Trek), a support engineer for the San Antonio library system who keeps the computer labs up and running at all the libraries in the city, a guy from Harley Davidson, and an MCT Visual Studio dev trainer. Quite a diverse group of people, as you can see. I also recommended to a couple of techies from South Africa that they take a swamp tour on an airboat. That has to be an experience like they have never had before. I tried to find them later to see what they thought but didn’t have any luck. Hope the gators or the swamp didn’t swallow them.


Guess I need to get the time zone updated on mine – yes that is mine with the Steelers background.

I attended a couple of PowerShell sessions – one was very basic intro stuff and the other was way over my head – and did multiple hands-on labs. There were exam prep classes taught by MCTs that provided insight into the types of questions that would be on the test and the best areas to focus on when studying. I also spent time in the certification area because self-tests were free and all of the reference books were available. I ran into old colleagues from Lendingtree and also VHA. Chris from VHA is now very active in the PowerShell community and even worked the Scripting Guy booth. On Thursday, I successfully passed my upgrade test from Windows Server 2008 and am now an MCSA: Windows Server 2012, and I even got a ‘Certifiable’ t-shirt to prove my accomplishment. That was the most sought-after shirt of the trip for me. I also went to the Microsoft store and grabbed a few onesies for my niece, who will be here in a month or so. I am going to be the techie influence in this young one’s life.


The closing event was held at the Superdome, home of the New Orleans Saints. The renovations they did following Hurricane Katrina are very nice. Drew Brees spoke for a couple minutes and then tossed the football around the VooDoo (Arena Football League) field to the kids of attendees. Imagine catching one of those!

All in all, it was a great event. I learned a lot. Met some cool people. Hung out with Gabe and watched him win all the good prizes.

Terri is a Support Specialist at OrcsWeb, a hosted server company providing managed hosting solutions.

TechEd 2013–Arrival

Being a native Louisianian currently living in South Carolina, I was stoked to see that TechEd was in New Orleans (NOLA) this year. I was offered the opportunity to go, quickly jumped on it, and began planning my trip. Yes, this would be a working trip, but I was ensuring a day on each side for sightseeing and partaking of the food that I have greatly missed.

So on Sunday, 6/2, our adventure began. I was so happy that I chose a direct flight after I heard about the fun my co-worker Gabe had getting from Charlotte to NOLA. By 11AM, Jeff and I were in NOLA and checked in to our hotel, the Bourbon Orleans. We caught the shuttle over so I could get registered and pick up all of my swag. I then decided to go and get my promotionally priced Surfaces, or Surfi; not sure of the plural. Registration started at 10AM, so I figured I was early enough to beat the crowd, get in and out, and be ready to explore. Boy, was I wrong! The line was already about a two-hour wait. I decided to bail (we had until Thursday to get them, so what was the rush?). The rest of the day was spent getting settled in and scoping out the hood. I chose to hit Pat O’Brien’s first since the wait line is LONG later in the day. We ate a late lunch/early dinner (a gumbo, étouffée, red beans and rice, and jambalaya sampler) and, of course, I had a signature Hurricane. We then went back to the room so I could verify my schedule and the sessions I would be attending on Monday.

Stay tuned for TechEd 2013 – The week


Scripting Games 2013

So I decided to challenge myself and signed up for the 2013 edition of the Scripting Games. Last week I entered my first submission. Being a novice at PowerShell, I was quite nervous about that first submission. All scripts are submitted anonymously and voted on by other participants/judges, and a score (with comments or suggestions) is returned on your script. The scripts are graded on a scale of 1-5. I like the fact that comments are recommended for all scores and required for 1s and 2s. This ensures that anyone giving a low score provides feedback to support the grade, while still allowing suggestions at every other score. It was nice to read the comments, for both good and bad scores, to see ways that I can improve my scripting skills.

In re-reading this before posting, I realized that it may sound like I totally bombed my first submission. I received a 3.28, which confirms that I have lots to learn but also gives me hope, since it wasn’t a 1.


Log Parser Lizard

I am sure I am not the only query-challenged IT person out there. I admit it. There are other things I would prefer to learn and play around with, so figuring out query syntax is pretty low on my list.

Log Parser is an excellent utility for querying text-based data. It is a command-line utility that uses SQL-style queries to parse logs for troubleshooting and information gathering. This can come in very handy when trying to locate IPs that are part of a brute force attack, or to see which pages are returning specific status codes, such as 500, in IIS logs. Now as I mentioned earlier, I am query-challenged, so what takes my colleagues 10 minutes to set up, I am still fighting with hours later.
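To give a feel for the syntax, here is the kind of one-liner my colleagues knock out in those 10 minutes. This is only a sketch – it assumes the default IIS log location and site ID (W3SVC1), so adjust the path for your server:

```
logparser -i:W3C "SELECT cs-uri-stem, COUNT(*) AS Hits FROM C:\inetpub\logs\LogFiles\W3SVC1\*.log WHERE sc-status = 500 GROUP BY cs-uri-stem ORDER BY Hits DESC"
```

The same approach works for the brute force scenario – group on c-ip instead of cs-uri-stem and sort by hit count to surface the noisiest client addresses.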

Enter Log Parser Lizard. This is a free-to-use GUI for Microsoft's Log Parser utility. Log Parser Lizard has quite a few built-in queries for the different types of text files that you can query. There are ‘buttons’, or folders, for Active Directory, Event Logs, Facebook FQL, File System, IIS Logs, Log4Net, and T-SQL. My main use for the application is parsing IIS logs. The default queries include the ability to query for file types, IP address ranges, ASP app errors, users, and HTTP status codes, to name a few. The Log Parser Lizard site has much more information about the product, so I would refer you there for additional details.

I hope you find this utility as helpful as I do.


SSL certificates for development sites on IIS7/7.5

On the topic of self-signed certificates, I wrote another blog post about SAN certificates that is over on my work blog. Jump over there and check it out. The post explains how to create a self-signed SAN certificate and then assign that certificate to multiple host header sites in IIS.


Self-Signed SSL certificates and IIS development site configuration using host header configurations for IIS7/7.5

One of our clients recently requested the ability to configure SSL for multiple development sites on a server with a single IP address. They had one certificate that was issued by an online CA for their production site and wanted self-signed certificates assigned to multiple development sites for testing purposes. In this walkthrough, I will provide information for creating a wildcard certificate that can be used for testing with any site in the same domain. Here is a blog post, written by Scott Forsyth, which provides details about the domain that I will use in this walkthrough. For the certificate creation, PowerShell 3.0 is required; it is part of the Windows Management Framework 3.0 package, which can be downloaded here. If you are not able to install this on your server, you can create the certificate on a different machine and export it to a pfx file for importing onto your server.

Here is a blog post that I wrote previously which can be used to create multiple websites using PowerShell, if you would like to experiment with this configuration. Once you have created your websites, you are ready to proceed through this post.
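If you just want a few throwaway sites to follow along with, the WebAdministration module can create them without clicking through IIS Manager. This is only a sketch – the site names, host headers, and paths below are hypothetical placeholders, and the folders should already exist:

```powershell
Import-Module WebAdministration

# Hypothetical site names and domain -- substitute your own
"dev1","dev2" | ForEach-Object {
    New-Website -Name $_ -Port 80 -HostHeader "$_.example.local" -PhysicalPath "C:\inetpub\$_"
}
```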

The cmdlet that we will use to create the self-signed wildcard is New-SelfSignedCertificate.

New-SelfSignedCertificate -DnsName <dnsname> -CertStoreLocation cert:\LocalMachine\My


The exact command that I ran for this walkthrough is 'New-SelfSignedCertificate -DnsName * -CertStoreLocation cert:\LocalMachine\My'. This created a self-signed certificate in my local machine store.


Since this certificate is created in the Personal store of the Local Machine, you can export it and import it into the Trusted Root Certification Authorities store so that it will be trusted. If you are planning to test these sites from a different machine than the one that hosts the website, you can also import the certificate into the Trusted Root Certification Authorities store on your workstation, and you will not receive any certificate warnings when testing.
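If you prefer to script that export/import rather than use the Certificates MMC, something like the following should work with the PKI cmdlets that ship alongside PowerShell 3.0 on Windows 8/Server 2012. The thumbprint and file path below are placeholders, not values from this walkthrough:

```powershell
# <thumbprint> is the thumbprint shown when the certificate was created (placeholder)
$cert = Get-Item cert:\LocalMachine\My\<thumbprint>
$pwd  = Read-Host "PFX password" -AsSecureString

# Export to a pfx, then import into Trusted Root (here, or on your workstation)
Export-PfxCertificate -Cert $cert -FilePath C:\temp\devwildcard.pfx -Password $pwd
Import-PfxCertificate -FilePath C:\temp\devwildcard.pfx -CertStoreLocation cert:\LocalMachine\Root -Password $pwd
```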

You are now ready to open IIS Manager and assign your newly created certificate to your websites. In order to enable the GUI host header field within the https bindings, the friendly name of your certificate has to start with *. Since we created the certificate as a wildcard certificate, we do not have to make any modifications to the friendly name.

Open IIS Manager, select the website that you want to add the SSL certificate to, and open Bindings from the Action pane.


Click Add and change the Type to https. You will notice that the Host name: field is greyed out and cannot be edited.


Once you select your certificate (*), this field will be editable, as seen below.


Enter your host header name in the Host name: box and click OK. You can also add this information using appcmd with the following syntax (replace name with your website name):

appcmd set site "name" /+bindings.[protocol='https',bindingInformation='*:443:name']

If you used the domain for this walkthrough, you are now ready to test your site without having to create DNS or local host file entries.

You are now on your way to happy development testing without pesky SSL warnings interrupting the flow.


Hosted Email

I host this blog on Cytanium. If you have not checked them out, you really should. Cytanium does not offer hosted email support, so yesterday Jessica wrote a great blog post about setting up a hosted email account that uses your domain name. I walked through the steps and now have an email address to go along with my online web presence. Thanks for the easy-to-follow post, Jessica.


Long TXT records–over 255 characters

For the record, let me state that I am not a DNS expert and this may seem simple for someone that is. But since I could not find anything to help me resolve this problem, I figured I would blog about it to possibly save someone else the headache that I encountered.

The other day I was working on an SPF record for a client that was over 255 characters, which is the per-string limit for TXT records in DNS. I spent quite a while working on this and tried every iteration I could think of to get it to work. I even found quite a few articles that talked about breaking the record into multiple strings but could not get that to work. So I settled for creating a new subdomain within the domain, adding part of the text to an SPF record in the subdomain, and then using the include: mechanism to pull the subdomain's record into the main one. This worked perfectly as a way to get around the issue. The client, however, did not like this solution. So back to the drawing board.

I am a literal person. If it is a string, I make it a string by using quotes. This is also how the examples showed it. That, needless to say, is not what needs to be done. Long story short, if you need to break an SPF record into multiple strings, here is how it is done.

  1. Create the new TXT record in DNS Manager.
  2. Add the first part of the record (up to but not exceeding 255 characters).
  3. Hit Enter within the record to start a new line.
  4. Add a space and then the next portion of the record, up to but not exceeding 254 characters.
  5. Repeat step 4 until all data is in the record.
  6. Save the record.
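The steps above produce a record that, in standard zone-file notation, looks roughly like this. The domain and mechanisms below are made-up examples; the point is that each quoted string stays at or under 255 characters, and resolvers concatenate the strings back into one value:

```
example.com.  3600  IN  TXT  ( "v=spf1 ip4:192.0.2.0/24 include:spf.provider-one.example"
                               " include:spf.provider-two.example ~all" )
```

Note the leading space on the second string – without it, the two pieces would run together when concatenated.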

You can then use a site like MXToolbox to verify the SPF record. If the record is correct, you will receive output that looks like this:


If any portion of the SPF record is incorrect, you will get an error and additional text which points to the problem with the record. If the record contains any string that is over 255 characters, you will receive an empty response.

At the end of the day, the client was happy and I learned something as well.


Creating AD users in bulk with PowerShell

The other day I was lurking on the PowerShell forum and found a question about importing an Excel spreadsheet to use for AD user account creation. It looked like a quick fix so I decided to give it a go. I am new to PowerShell so real world ideas like this one provide a great way for me to learn while also helping out in the community. Needless to say, I found out that it wasn’t the quick fix that I thought it would be.

I decided to use a CSV file as the source rather than Excel, since I was working on a server that did not have Excel installed, and I pretty quickly got the script to work using the ActiveDirectory module provided by Microsoft. When I went to verify the results, however, all of the programmatically created users were disabled. What good is a ‘working’ script if the output doesn’t provide the required functionality? All of the accounts were created, but I was unable to even manually enable an account due to an error that the password did not meet my domain’s complexity requirements. I verified that the password used in the script was actually valid: I could reset the user’s password to the one in the CSV file and enable the AD account without any errors. This pointed to an issue with the way I was setting the password in the script. I googled the issue and, lo and behold, there was a blog post written about this exact issue.

After integrating the code snippet from the above blog post, I was able to successfully create enabled and functional AD users. Here is the script and a sample CSV file that can be used as a starting point. Since there are so many fields that can be set for an AD user, I created a very small sample but this can be expanded to include any attributes that are required by your organization.

# CreateADUsers.ps1
Set-ExecutionPolicy Unrestricted
Import-Module ActiveDirectory

$csvpath = "c:\scripts\Newusers.csv"
$date    = Get-Date
$logfile = "c:\scripts\create_AD_users.log"
$i = 0

# Specify parent container for all new users.
$OU = "OU=UsersOU,DC=domain,DC=com"

Import-Csv $csvpath | ForEach-Object {
    $sam    = $_.Username
    $exists = $null   # reset so a failed lookup doesn't reuse the previous user's result
    Try   { $exists = Get-ADUser -LDAPFilter "(sAMAccountName=$sam)" }
    Catch { }
    If (!$exists)
    {
        $i++
        $Password = $_.Password
        New-ADUser $sam -GivenName $_.GivenName -Initials $_.Initials -Surname $_.SN -DisplayName $_.DisplayName -EmailAddress $_.EmailAddress -PassThru |
        ForEach-Object {
            $_ | Set-ADAccountPassword -Reset -NewPassword (ConvertTo-SecureString -AsPlainText $Password -Force)
            $_ | Enable-ADAccount
        }

        # Set an ExtensionAttribute if required (add an $ext.Put() call before SetInfo)
        $dn  = (Get-ADUser $sam).DistinguishedName
        $ext = [ADSI]"LDAP://$dn"
        $ext.SetInfo()
        Move-ADObject -Identity $dn -TargetPath $OU

        $newdn = (Get-ADUser $sam).DistinguishedName
        Rename-ADObject -Identity $newdn -NewName $_.DisplayName

        $output  = $i.ToString() + ") Name: " + $_.Username + "  sAMAccountName: "
        $output += $sam + "  Pass: " + $_.Password
        $output | Out-File $logfile -Append
    }
    Else
    {
        "SKIPPED - ALREADY EXISTS OR ERROR: " + $_.CN | Out-File $logfile -Append
    }
    "----------------------------------------" + "`n" | Out-File $logfile -Append
}

This is the sample CSV (newusers.csv) data that I used in testing the script.

GivenName,Initials,SN,DisplayName,EmailAddress,Username,Password
"Susan","SU","User","Susan User","","susan","~RP:hoV.ZmE4tS6Z"
"James","JU","User","James User","","james","~RP:hoV.ZmE4tS6Z"
"Ronnie","RU","User","Ronnie User","","ronnie","~RP:hoV.ZmE4tS6Z"

I hope you find this script useful and it saves you time when needing to create bulk AD users in your production or test environments.


Win8 and OneNote

Recently I upgraded my work laptop to Windows 8. It has been a definite learning curve, but so far, so good. We use OneNote to share information between teams, and I could successfully add a shared notebook, but after closing OneNote and opening it again, I would be unable to authenticate to open the notebook. If I closed the notebook, I could add it back, authenticate, and work again. My personal notebook worked fine and would sync without any issues. After a few days of this, I decided to figure out what was causing the issue.

In troubleshooting this issue, I saw that there were two versions of OneNote listed in the Open with application box. One of the choices was OneNote and the other was OneNote (desktop). This led me down the trail of seeing if I had the Win8 OneNote app installed. I did. When I opened it, I noticed that it was not authenticated. Since I do not need both the desktop version and the app installed, I removed the app, and authentication for the shared notebook started working as expected. If you are a OneNote lover like I am, hopefully this will save you the headache that I encountered when troubleshooting this issue.