• Windows 2008 Performance Alerts

    This may seem silly to some of you, but I am still getting used to Windows 2008.  Sadly, I don't spend as much time actually administering servers as I used to (silly management), so it usually takes me a bit longer to find my way around 2008 than 2003.  I like to think they made everything more complex, though I'm sure I'll get booed for saying that.

    Anyways, this morning I was attempting to set up some performance alerts on a few servers we're having issues with.  Basically, I wanted them to email us whenever a counter crossed a certain threshold.  No big deal, I thought: I created the email app, created a performance counter alert, and then manually wired the two together.

    Needless to say, that didn't work.  It took me a while to figure out why, too, since my little email utility worked fine on its own.  So I began a new search to find out how stupid I was being.

    Turns out, quite a lot of stupid.  Instead of using a custom utility, you can now use a scheduled task…which includes an email action!  I basically used the instructions over at Sam Martin's blog, which, I may add, he only posted in April of this year.  I'm not the only n00b.  Plus, who doesn't have an enterprise monitoring system that deals with this sort of stuff already (at least at the types of clients I work with)?

    Perfmon

    1. Open up perfmon
    2. Create a new User Defined Data Collector Set
    3. Choose to create manually after naming
    4. Select Performance Counter Alert
    5. Add in the performance counter you care about (mine was Requests Executing under the ASP.NET Apps v4.0 counters)
    6. Choose the user to run it as
    7. Edit the Data Collector in the Data collector set
    8. Change the sample interval to whatever works for you (I set mine to 60s so we can be on top of issues before the users notice)
    9. Under Alert Task, give it a task name (e.g. EmailAlert) and a task argument (you can combine the tokens to form a sentence like "the value at {date} was {value}")
    10. Start the Data Collector Set

    Scheduled Tasks

    1. Open up scheduled tasks
    2. Create a task, not a basic task
    3. Name it the exact same name you used in step 9 above (i.e. EmailAlert)
    4. Check "Run whether user is logged on or not" so that it runs all the time
    5. Create a new email action under the Action tab
    6. Enter all the info for from, to, subject, etc.  To send to multiple people, comma-separate the addresses.
    7. For the body, type whatever you want, and then $(Arg0) will pass the task argument you made in step 9 above.
    8. Enter the SMTP server.

    Done!
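
    For reference (and in case your SMTP server needs something beyond what the built-in action asks for), here's roughly the shape of the little email utility I had originally written.  This is only a sketch: the addresses and server name are placeholders, and the scheduled task would hand it the alert text as its first command-line argument, the same idea as $(Arg0).

    // Rough sketch of a command-line alert mailer (addresses and SMTP server are placeholders).
    // The scheduled task passes the alert text as the first argument.
    using System;
    using System.Net.Mail;

    class AlertMailer
    {
        static void Main(string[] args)
        {
            string body = args.Length > 0 ? args[0] : "Performance alert fired.";
            var client = new SmtpClient("smtp.yourcompany.example");
            var message = new MailMessage("alerts@yourcompany.example",
                                          "ops@yourcompany.example",
                                          "Performance counter alert",
                                          body);
            client.Send(message);
        }
    }

    The nice part of the built-in email action is that you get to skip all of this, but it's handy to keep around if you ever need fancier formatting.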

    One caveat: since the performance counter is tied to an application pool, whenever that pool goes away (IISReset, idle timeout, etc.) the counter stops collecting, and so do the alerts.

    Currently Reading (could take a while): A Game of Thrones: A Song of Ice and Fire: Book One

  • Dummy Files

    We are doing some document uploading to SharePoint and needed some test files of various sizes.  You don't need any special tools to make these; fsutil ships with Windows.  Just make sure you run the command prompt as administrator, and use the following command.

    FSUTIL FILE CREATENEW 100MBTest.mdb 104857600
    
    Usage: FSUTIL FILE CREATENEW [Filename] [Size in bytes]
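
    If you need a whole batch of sizes, or would rather script it, a few lines of C# do the same thing: SetLength stretches the file to the requested size without writing the data out.  Just a sketch, with arbitrary file names and sizes:

    // Create placeholder files of various sizes (same effect as fsutil file createnew).
    using System.IO;

    class MakeDummyFiles
    {
        static void Main()
        {
            foreach (var mb in new[] { 1, 10, 100 })
            {
                using (var fs = new FileStream(mb + "MBTest.mdb", FileMode.CreateNew))
                {
                    fs.SetLength((long)mb * 1024 * 1024);   // size in bytes
                }
            }
        }
    }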

  • Migration to WordPress Network

    Well, the migration to a WordPress Network is nearing completion.  All sites except for my largest (in terms of uploaded content) have been migrated, and I'm still figuring out what I'm going to do with my Director install.  This will save me so much time on upgrades and other maintenance from now on.

    I know that I had a post about this earlier, specifically how to do this on Dreamhost, but I thought I’d provide a few additional insights.

    1. If you are using subdomains instead of folders, do not have Dreamhost tack on www to the root Network domain until after you've migrated all sites over; otherwise it tries to resolve subdomain.domain.com to www.subdomain.domain.com, which doesn't exist.  If you aren't migrating, but are simply setting up all new sites, then you don't have to worry about it.
    2. I am using full domains to map to the subdomains via the WordPress MU Domain Mapping plugin.  If you want to have non-www resolve to www, make sure www.notnetworkdomain.com is your primary and notnetworkdomain.com is also listed.
    3. If you already have the Domain Mapping set up to point to your www.notnetworkdomain.com during the migration, then when you try to hit subdomain.networkdomain.com, it will redirect you to www.notnetworkdomain.com.  That is, reverse mappings happen also, which is cool.
    4. Depending on when you set up your domains at Dreamhost, it is possible their A records point to different IP addresses.  Mirroring only works when they have the same A record IP (a quick way to check is shown after this list).  This happened to me on quite a few domains.  To fix, simply delete the domain from Dreamhost, then recreate it as a mirror.  Once the mirror is created, re-create your custom MX records or any other DNS records you originally had.
    5. Don’t import entries on a crappy network connection.  It doesn’t work and continuously times out.
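
    For point 4, running nslookup against each domain is the quickest check.  If you want to eyeball a whole batch at once, a few lines of C# will do it too.  The domain names below are just the placeholders from the list above, and keep in mind you're at the mercy of whatever DNS your machine currently resolves against:

    // Print the A records for each domain so you can eyeball whether they match.
    using System;
    using System.Net;

    class CheckARecords
    {
        static void Main()
        {
            foreach (var host in new[] { "networkdomain.com", "notnetworkdomain.com" })
            {
                var ips = Dns.GetHostAddresses(host);
                Console.WriteLine("{0}: {1}", host,
                    string.Join(", ", Array.ConvertAll(ips, ip => ip.ToString())));
            }
        }
    }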

    Outside of those issues, the migration went fairly painlessly.  It also helps to have a host with quickly updating DNS.  I can't tell you how many times I saw the "bad_http_conf" error while I was waiting for DNS to propagate.

    I’m also amazed at how easily most of the plugins have worked with the network config.  I was really expecting more pains in that area.

  • Upgrade K2 Workflow Instances to a Specific Workflow Version

    We were having a specific issue in our Dev and QA environments where K2 was consuming over 16GB of disk space and subsequently causing the server to run out of room.  We had an interim workaround of restarting the K2 service, but within a day of testing, K2 could eat it all up again.

    This was happening because workflow instances (cases, in our example) are tied to specific versions of the workflow.  Much like .NET websites, K2 has to do a lot of pre-compiling before a specific version can be used.  Now, I'm not sure why it was using so much disk space per version, but that is essentially what was causing all our issues.

    There are a few things that could’ve made this better:

    1. Testers and Developers not using old cases which are tied to older versions
    2. Building our K2 workflows in Release instead of Debug

    Turns out option #2 reduces the space an individual version uses by orders of magnitude.  Sadly, there is no way to retrofit the processes that are already deployed in K2.

    The actual solution is to use some of the newer APIs, specifically the Live Instance Management APIs (and that took a while to find via searching).  The downside is that these APIs were added in 4.5, so anyone on a version prior to that is screwed.  Thankfully we were on 4.5.1!

    Anyways, if you're lazy, there is already a utility on K2 Underground that does this, along with some additional info about it.

    Just be prepared for it not to work all that well.  We received a ton of Null Reference errors while running the utility against our large database.  It seemed to work fine in our POC, but not against the real thing.  Some cases were changed, but not all, and we still had the same issue.

    In the end, we had to manually go and delete the old cases in K2, which is definitely not supported.  However, our app handles it gracefully, so it wasn’t a huge deal.

  • Clean Up WinSxS Folder

    The WinSxS folder gets large, but that is because of all the patches that have been applied to your installation.  Do not simply delete the files, because doing so can cause all sorts of havoc.  The only supported way to actually reduce the size somewhat is to have Windows remove the updates superseded by a service pack after you install one.  You can do this with the following command.  I was able to clean up 4GB using it.  Nothing major, but almost a 25% improvement.

    dism /online /cleanup-image /spsuperseded

    Run as administrator.
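
    If you want to see how much you actually got back, comparing the drive's free space before and after is more honest than measuring the WinSxS folder itself, since hard links make that folder look bigger than the space it really consumes.  A quick sketch (the system drive is assumed to be C:):

    // Report free space on the system drive; run before and after the cleanup and compare.
    using System;
    using System.IO;

    class FreeSpace
    {
        static void Main()
        {
            var drive = new DriveInfo("C");   // assumes Windows lives on C:
            Console.WriteLine("Free space on {0} {1:N0} bytes", drive.Name, drive.AvailableFreeSpace);
        }
    }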


  • WordPress Network on Dreamhost

    I’m just copying these instructions here, since I can’t seem to reliably access the website they are currently hosted on.

    I think I'm going to change a few ways things are hosted here on my collection of sites.  It is a pain to keep them all updated and managed.  Therefore, I am looking into migrating to a WordPress Network (Multi-Site) setup.  However, using subdomains doesn't work out of the box with Dreamhost, because Dreamhost doesn't allow wildcard DNS.

    Here is the work-around (verbatim) from a fellow Dreamhost user.

    1. Follow the directions for setting up your WordPress multisite (I will assume that you already know or have found the directions to do so).
    2. Pick the subdomain feature while in setup.
    3. Ignore the message you get saying that things may not work properly because of the missing wildcard.
    4. When you've followed all those wonderful instructions, create a test blog. You may end up with a URL like domain.comdatabase_name. I did, and continue to do so. Just edit that blog's settings (Super Admin -> Sites -> Edit) so that the Path field only has a / in it. I'm unsure if this is a problem that users at other hosts have, or if it is exclusive and will soon be fixed, but once it is changed all is well.
    5. Go to your Dreamhost Web Panel and mirror a subdomain for the blog you just added to the domain you set the multisite up on. NOTE: If you create blog.domain.com in WordPress multisite, you must create the same subdomain (blog.domain.com) in your Web Panel. Visit blog.domain.com and verify that it leads you to the new blog you created.
    6. If that works, then you need to install the Domain Mapping plugin for multisite.
    7. This gives you a new menu to add a domain to your network and "park" it over a subdomain.
    8. All you have to do is follow the directions there to add the domain, and you are set.
    9. Just mirror that same domain to the multisite domain in your Dreamhost Web Panel, and everything should be in working order.

    Now on to testing!

  • Self Signed SSL Certs

    I've always hated creating self-signed SSL certs.  It never seemed like there was a good and easy way to accomplish this.  Yes, you could download the IIS 6 Resource Kit, but that's just one more thing I don't need on my machine.

    Well, in IIS7, there is actually an option to automatically create one.  Technet has a good walkthrough of it.  However, there are some limitations:

    • The common name is always the machine name of your IIS server.
    • The certificate is only valid for one week.
    • The certificate is not added to the "Trusted Root Certification Authorities" store, so browsers will still complain (see the snippet below for a workaround).
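
    That last limitation is easy enough to work around on machines you control: export the certificate to a .cer file and drop it into the Trusted Root store.  A rough sketch (the path is a placeholder, and it needs to run elevated):

    // Add an exported certificate to the local machine's Trusted Root store so the
    // self-signed cert is trusted. The .cer path is a placeholder; run elevated.
    using System.Security.Cryptography.X509Certificates;

    class TrustCert
    {
        static void Main()
        {
            var cert = new X509Certificate2(@"C:\certs\selfsigned.cer");
            var store = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadWrite);
            store.Add(cert);
            store.Close();
        }
    }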

    Thankfully, a remake of SelfSSL was also created for IIS7, and it is both more powerful and easier to use.  Take a look at Thomas Deml's post on it to see the syntax and download the program.  I've also uploaded it here, just in case anything happens to that site.

    SelfSSL7

  • More Visual Studio 2010 Performance Testing "Fun"

    This is a continuation of my previous post on the half-baked core features of load testing in Visual Studio 2010.  We had been progressing fairly well, but with some of the new fixes that have gone into the application, we have reached new issues.

    I would like to preface this by saying that our application is by no means great.  In fact, it is pretty janky and does a lot of incredibly stupid things.  Having a 1.5MB viewstate (or larger) is an issue, and I get that.  However, the way that VS handles it is plain unacceptable.

    With that said, I'm sure you can imagine where this is going.  When running an individual webtest, the captured body of each response is capped at 1.5MB.  This took a bit of time to figure out, as many of our tests were simply failing.  The best part is that we have a VIEWSTATE extract parameter (see #1 on the previous post), and the error we always get is that the VIEWSTATE cannot be extracted.  Strange, I see it in the response window when I run the webtest.  Oh, wait, does that say Response (Truncated)?  Oh right, because my response is over 1.5MB.

    Oh, and that's not just truncated for viewing; it's truncated in memory.  Needless to say, this has caused a large number of issues for us.  Thankfully, VS 2010 is nice and you can create plugins to get around this (see below).  The downside is that VS has obviously not been built to run webtests of our complexity, and definitely not with the 1.5MB response limit bypassed.

    // Using directive and plugin class wrapper added so this compiles; the class name is arbitrary.
    using Microsoft.VisualStudio.TestTools.WebTesting;
    public class ResponseSizePlugin : WebTestPlugin
    {
        public override void PreWebTest(object sender, PreWebTestEventArgs e)
        {
            // Raise the response body capture limit from the ~1.5MB default to ~15MB
            e.WebTest.ResponseBodyCaptureLimit = 15000000;
            e.WebTest.StopOnError = true;
            base.PreWebTest(sender, e);
        }
    }
    

    If you use this plugin, be prepared for a lot of painful hours in VS.  I am running this on a laptop with 4GB of RAM, and prior to the webtest running, devenv.exe uses ~300MB of RAM.  During the test, that balloons to 2.5GB and pegs one of my cores at 100% utilization as it attempts to parse all the data.  Fun!

    The most data we ever had in the test context was 30MB.  Granted, as mentioned earlier, that is a lot of text.  However, I fail to see how it accounts for almost 100x that amount in RAM.

    Thankfully, in a load test scenario all that data isn’t parsed out to be viewed and you don’t have any of these issues.  You just need to create perfect scripts that you don’t ever need to update.  Good luck with that!

    Oh, and as an update on #3 from my previous post: I filed a bug for it, but haven't heard anything back.  Needless to say, we are still having the issue.

    And I realize that we've had a lot of issues with VS 2010, and I get angry about it.  However, I want to reiterate that none of the testing platforms out there are good.


  • Windows Home Server 2011 Released

    I was poking around on MSDN and noticed that WHS 2011 actually went live sometime last month.  Well, you know what that means…time to upgrade!  Unfortunately, there isn't really an upgrade path; you need to back up and then restore all your data.  Thankfully, since my instance of WHS is virtualized, I don't need to do all that work.  Instead, I can just bring up a separate virtual machine and copy the data over.

    I’m sure that I’ll run into issues and various other items.  As I do, I’ll be sure to share them here, so keep an eye on this space.


  • A Better Backup Plan

    For the longest time, I’ve been using Carbonite as my backup provider.  I can’t say that I was ever really unhappy with it, but I was hoping to find something better as my yearly subscription was expiring.  Some of the issues I was trying to move away from:

    • As was previously mentioned, I have a WHS machine, and getting Carbonite to work correctly with it wasn't as easy as I had hoped.  Since Carbonite wouldn't follow the tombstone files, I was forced to run my system in a single-drive configuration.  Not all bad, and since it was virtualized anyway, it didn't matter much.
    • The UI is very slow.
    • While you can back up an "unlimited" amount of data, somewhere around 50-100GB they start to throttle you significantly.  When I recently added our wedding photos (11GB), it was going to take over 4 days just to upload that incremental amount.
    • Very vanilla without many options.

    I hadn't really been shopping around, but when Mozy announced their pricing changes, it piqued my interest to look around again.  I had heard about the Mozy price changes over at TechCrunch, and was reading in the comments about where people were flocking to now.  That's where I found out about CrashPlan, or at least I thought I knew what it was all about.

    Well, my Carbonite plan has about 2 months left on it, and since I have a fair amount of data to back up (just over 200GB), I figured now was the time to make the move.  Since I based my upload estimate on Carbonite's speeds, those two months should just about cover the time it takes.

    And man, am I glad I moved!  There are so many awesome features in CrashPlan that not only am I going to be using it, I'm going to get others in the family to use it too.

    First, just like Carbonite, I can pay to have my data up in the cloud.  There are a lot of similar items to Carbonite, but there are some nice advanced options:

    • Easily select which folders you want to upload – Same for both
    • Runs as a service, so you don’t have to be logged in – Same for both
    • Personal Encryption Key – Same for both
    • Follows junctions and tombstone files – Only CrashPlan
    • Can rent a 1TB drive to seed the initial upload (didn’t use, but nice option) – Only CrashPlan
    • Backup sets to have different backup intervals – Only CrashPlan
    • Backs up all file types (unless excluded through filter) – Only CrashPlan
    • No throttling, but can specify client side throttling based on multiple factors – Only CrashPlan

    Mind you, those are just the basic items that Carbonite also offers (did I mention CrashPlan is cheaper too?).  However, CrashPlan also has a ton of other features in case you don't want to upload to their cloud.  The best part?  If you don't use the cloud service, you don't have to pay for it.

    This is a great feature for those that have a lot of storage in an always-on system and want to make a private cloud solution for family members.  It is actually something I’ve been trying to find so that my parents have a trusted cloud-based backup solution on my hardware.

    The even better part?  Based on my testing with my work laptop, it just works!  I have a fairly complex networking setup at home, and while in the office, my laptop was able to connect and start backing up with no issues.  The only difference between what I did and what my parents will need to do is that they'll create an account and "link" it to mine with a backup code that is unique to me.  From there, it starts to sync and they are off to the races.  I can specify a quota for them server-side too, so it doesn't go crazy.

    Overall, I wish I had migrated earlier.  I definitely don’t feel bad in moving away from Carbonite, now that I’ve actually played with the software.  It solves all my initial issues, plus solves an ongoing problem I’ve been trying to fix.  Definitely a huge plus!  In fact, based on my experience, I would definitely consider their business service for an initial startup.  Just sayin’.