nopCommerce Install

The other day I was playing around with nopCommerce.  There was some talk about it internally, and I thought I’d see what it was all about.  I didn’t get very far before realizing that the installation instructions were missing a few steps.  The guys over there have outlined most of the process in the documentation, but they’ve forgotten a couple of things:

  1. Ensure that your worker process identity (the account the AppPool runs under) has permission to create a database if you check the Create database if it doesn't exist box.
  2. The installation page isn't linked anywhere obvious; you need to browse to it directly at http://site/views/install/default.aspx

There are other OWASP and scalability best practices that I may go into later if I dig deeper, but three immediately stand out:

  1. Unencrypted DB Connection string info
  2. compilation debug="true" being set in the web.config
  3. Single DB
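On the second point, the fix is a one-line change in web.config; here it is stripped down to just the relevant element:

```xml
<configuration>
  <system.web>
    <!-- debug="true" disables batch compilation and response caching;
         it should always be off in production -->
    <compilation debug="false" />
  </system.web>
</configuration>
```

The first point can be handled the standard ASP.NET way: encrypt the section with aspnet_regiis.exe -pef "connectionStrings" <path-to-site> instead of leaving it in plain text.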

Synology DS1511+ and Crontab

I’ve added an rsync job to my crontab file to back up all of the websites I serve from Dreamhost (including this one).  The job is set to run every night at midnight, starting last night.  Unfortunately, it didn’t run.

This is because the crontab service needs to be restarted in order to pick up new jobs (also, don’t update your DSM, because that seems to blow the jobs away).  As this is a non-standard Linux distro, you need to restart crond the following way:

/usr/syno/etc.defaults/rc.d/S04crond.sh stop
/usr/syno/etc.defaults/rc.d/S04crond.sh start
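For reference, the job itself is just a line in /etc/crontab; something like this (the user, paths, and Dreamhost hostname here are placeholders, and note that Synology's crond expects tabs between fields):

```
#minute	hour	mday	month	wday	who	command
0	0	*	*	*	root	rsync -az user@server.dreamhost.com:~/example.com/ /volume1/backup/websites/
```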


Synology Plex Media Server and Samsung Smart TV Client

This one wasn’t completely obvious, but I think I’ve managed to figure it out.  It at least appears to be working correctly; we’ll see if it behaves even better once the media scan is complete.

Steps for the Server:

  1. Grab the spk from http://www.plexapp.com/linux/linux-pms-download.php.
  2. Log into DSM and in Package Installer, install the downloaded spk.
  3. After it is installed, visit the website at http://<nas-server>:32400.  It doesn’t look like the shortcut that is created works.
  4. Add in the locations to your media.

The steps for the client on a Samsung TV with SmartHub are broken into two options: an installer hosted on your own server, or one hosted on someone else’s.  It doesn’t matter where you get the installer from, as you can specify the Plex Server after the application is installed.

Hosted on your NAS:

  1. In Control Panel, enable Web Station under Web Services.
  2. Copy the installer (link) to the web share that was created in step 1.
  3. Copy the widgetlist.xml (link) to the web share that was created in step 1.
  4. Edit the widgetlist.xml to contain the IP of your NAS (or the URL where the installer is located).
  5. On the TV, open the Smart Hub
  6. Log in as a different user (A/red button)
    • User: develop
    • Password: 123456
  7. Click the Settings button (D/blue button)
  8. Select Development
  9. Set the Server IP to that of your NAS
  10. Select User Application Synchronisation
  11. Once the installation is finished, restart your TV
  12. Visit SmartHub and Plex is installed.
  13. Point Plex at your Plex Media server.

Hosted by someone else:

  1. On the TV, open the Smart Hub
  2. Log in as a different user (A/red button)
    • User: develop
    • Password: 123456
  3. Click the Settings button (D/blue button)
  4. Select Development
  5. Set the Server IP to 92.50.72.58
  6. Select User Application Synchronisation
  7. Once the installation is finished, restart your TV
  8. Visit SmartHub and Plex is installed.
  9. Point Plex at your Plex Media server.

These install instructions were taken from the Plex forums.

Update 1/6/2012: The crawler has completed, and it does actually work!  I also found out that it only supports TV shows right now, not music or photos.  Looking into it, it’s just a webpage with a lot of JavaScript.  If I have time, I may look at adding music support, as having one solution for everything is a lot better than running both this and DLNA!

Update 8/19/2012: Instead of going through all of this, just grab the Plex app from the Samsung App Store!


SharePoint 2010 User Profile Sync: stopped-extension-dll-exception

Well, it’s good to see that User Profile Sync can be better in 2010 than it was in 2007.  However, there are definitely some issues still.

The first one, which we just noticed, was that the User Profile Sync jobs were constantly failing.  Unfortunately, there isn’t really a good way to know without going into the MIISClient program to look at the errors.  Basically, if you think profile sync is not working for whatever reason, open up MIISClient.exe (in Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell) as the farm account and take a look to see if everything is a success.

For us, we were seeing all of the MOSS-{guid} jobs failing with the error stopped-extension-dll-exception.

Based on the lovely error message, I’m still amazed that I was able to isolate this issue (event logs reported that CA was being accessed via a non-registered name).  However, it turns out it is because of alternate access mappings (AAMs) for the central admin (CA) website.  Normally, SharePoint sets up the AAM for CA as the machine name you first install SharePoint on to.  However, we changed the AAM to be a more friendly name.

When you update the “Public URL for Zone” for the CA website, the change does not propagate into the MIISClient.  This prevents the MIISClient from correctly accessing the CA APIs for the user profile sync (or at least I imagine that is the case).

Fix it with the following steps:

  1. Run MIISClient.exe as the farm account.
  2. Tools > Management Agents (or click the Management Agents in the bar)
  3. Right-click on the MOSS-{guid} management agent and select Properties
  4. Go to the Configure Connection Information section in the left-hand pane
  5. In the connection information box, change the Connect To URL to be the same URL as listed as the “Public URL for Zone” for your CA in the AAM configuration.
  6. Re-enter the farm account username and password for good measure
  7. Save the configuration
  8. Run a full profile sync from CA

Synology DS1511+ and CrashPlan

These instructions were ripped verbatim from Kenneth Larsen’s blog because they just worked!  You can use either vi or nano to edit the files.

  1. Download the latest release of the Linux CrashPlan client from the CrashPlan website (along with the client for your own operating system, if you use something other than Linux).
  2. Upload the Linux client to the NAS and log in to the NAS as root using SSH.
  3. You will need ipkg installed in order to do the rest. If you haven’t set it up already, you can follow this guide: http://forum.synology.com/wiki/index.php/Overview_on_modifying_the_Synology_Server,_bootstrap,_ipkg_etc#What_is_a_Synology_Server
  4. Install a few extra packages on the NAS from the command line:
    1. ipkg update
    2. ipkg install nano
    3. ipkg install cpio
    4. ipkg install bash
    5. ipkg install coreutils
  5. Move the uploaded client package to /opt
  6. Unpack the crashplan client: tar -xvf name_of_downloaded_archive_file
  7. Modify the install.sh script in the newly created directory to use bash as your shell. The first line in the script should be replaced with this one: #!/opt/bin/bash
  8. Install CrashPlan using the options below. When asked about Java, allow it to be downloaded:
    • CrashPlan will install to: /opt/crashplan
    • And put links to binaries in: /opt/bin
    • And store datas in: /opt/crashplan/manifest
    • Your init.d dir is: /etc/init.d
    • Your current runlevel directory is: /usr/syno/etc/rc.d
  9. Modify /opt/crashplan/bin/run.conf by adding -Djava.net.preferIPv4Stack=true as an additional option at the end of the two configurations (this was already added when I did the install)
  10. Remove the command-line options for the ps process in the /opt/crashplan/bin/CrashPlanEngine file, since ps doesn't accept parameters on the Synology NAS: sed -i 's/ps -eo /ps /' CrashPlanEngine; sed -i 's/ps -p /ps /' CrashPlanEngine
  11. Modify the /usr/syno/etc/rc.d/S99crashplan file line 1 to : #!/opt/bin/bash
  12. Modify the /opt/crashplan/bin/CrashPlanEngine file line 1 to: #!/opt/bin/bash
  13. Modify the /opt/crashplan/bin/CrashPlanEngine file line 14 with a full path for nice to: /opt/bin/nice
  14. Start your crashplan service /usr/syno/etc/rc.d/S99crashplan start
  15. Validate that the service is running: netstat -an | grep ':424.' should show two listeners:
    • tcp        0      0 0.0.0.0:4242            0.0.0.0:*               LISTEN
    • tcp        0      0 127.0.0.1:4243          0.0.0.0:*               LISTEN
  16. Edit /etc/rc.local and add "/usr/syno/etc/rc.d/S99crashplan start" (without the quotes) so the service starts after a reboot.
  17. Install your desktop client and point it at the headless service you just installed. Follow the instructions on the CrashPlan website for this (http://support.crashplan.com/doku.php/how_to/configure_a_headless_client)
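The sed and shebang edits in steps 10–12 are the fiddly part. Here is the same set of edits applied to a scratch copy, so you can see the before/after without touching anything under /opt (the demo file just mimics the relevant lines of CrashPlanEngine):

```shell
# Build a scratch file that mimics the lines CrashPlanEngine uses
cat > /tmp/CrashPlanEngine.demo <<'EOF'
#!/bin/sh
PID=$(ps -eo pid,cmd | grep CrashPlanService | awk '{print $1}')
ps -p $PID > /dev/null
EOF

# Step 12: point the script at the ipkg-installed bash
sed -i '1s|.*|#!/opt/bin/bash|' /tmp/CrashPlanEngine.demo
# Step 10: Synology's busybox ps accepts no options, so strip them
sed -i 's/ps -eo /ps /' /tmp/CrashPlanEngine.demo
sed -i 's/ps -p /ps /' /tmp/CrashPlanEngine.demo

cat /tmp/CrashPlanEngine.demo
```

For the real thing, run the same sed commands against /opt/crashplan/bin/CrashPlanEngine and /usr/syno/etc/rc.d/S99crashplan as the steps describe.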

Update 8/19/2012: Just to post a reply to this: there is a much better way to get this working now (and it is what I use).  Check it out at http://pcloadletter.co.uk/2012/01/30/crashplan-syno-package/


New Home Server Setup

I’ve been meaning to do this for a while, but I hadn’t found a suitable replacement until recently.  I am decommissioning the server at home.  It’s loud, large, and sucks down a lot of power for what I use it for (Windows Home Server).  It was nice because I could quickly and easily spin up some VMs and poke around, but I’ll still be able to do that.

Instead, I picked up a Synology DS1511+ NAS.  This little appliance is pretty darn slick.  It can pretty much do everything I was doing, in a smaller, quieter, and cooler form factor.  Since it uses an Atom processor, it runs a fairly familiar flavor of Linux, so you can do quite a bit with it.  Plus, a lot of the default stuff it comes with is quite nice!

I’ll be throwing up a few copy/pastes on the site so that I can quickly re-reference.  Oh, and there’s another SharePoint article in the works too.  Busy, busy!

SharePoint 2010 Synthetic File Data

Still trying to work through creating synthetic data for an out-of-the-box SharePoint performance test.  The script below creates a new site collection (so it doesn’t interfere with anything else and is easy to clean up) and uploads all the test data into it.  The biggest downside right now is that the data is created on disk and then uploaded, which requires enough disk space to hold the generated files.  Not a huge issue for me, but possibly for you.

General idea came from a few places for the upload, and locally for the file creation.

#USER Defined Variables
#Specify the extension type of files you want uploaded
$strDocTypes = @(".docx", ".xlsx", ".pptx", ".pdf")
#The max amount of data generated in MB
$maxSize = 50
#The max size one file could be in MB
$maxFileSize = 10
#Intermediate folder where the test data is placed
$fileSource = "F:\TestData"
#New Content Database (for easy removal)
$dbName = "Portal_ContentDB2"
#New Site collection template
$template = "SPSPORTAL#0"
#Account owner
$siteOwner = "TEST\Administrator"
#Web Application address
$webApp = "https://portal"
#Site Collection Address
$siteCollection = "/sites/1"
#Do not edit anything beyond this line

#Create all the test data using FSUTIL
$rand = New-Object System.Random
$fileTotalBytes = 0
do {
	$guid = [guid]::NewGuid().ToString()
	$fileName = $guid + $strDocTypes[$rand.Next(0, $strDocTypes.Length)]
	$rand1 = $rand.NextDouble()
	$rand2 = $rand.NextDouble()
	$rand3 = $rand.NextDouble()
	[int]$fileSize = 1048576 * $rand1 * $rand2 * $rand3 * $maxFileSize
	#Create the file in the intermediate folder so the upload step below can find it
	FSUTIL FILE CREATENEW (Join-Path $fileSource $fileName) $fileSize
	$fileTotalBytes = $fileTotalBytes + $fileSize
	$fileTotal = $fileTotalBytes/1024/1024
}
#Data generation keeps going until the amount of data is > $maxSize
while ($fileTotal -le $maxSize)

#Creation of the new content database and site collection
$siteCollectionURL = $webApp + $siteCollection
New-SPContentDatabase $dbName -WebApplication $webApp
New-SPSite -Url $siteCollectionURL -OwnerAlias $siteOwner -Name "Test Doc Library" -Template $template -ContentDatabase $dbName

#Upload all of the generated data into the $siteCollectionURL/Documents folder
$spWeb = Get-SPWeb -Identity $siteCollectionURL
$spFolder = $spWeb.GetFolder("Documents")
$spFileCollection = $spFolder.Files
Get-ChildItem $fileSource | ForEach-Object {
	$spFileCollection.Add("Documents/$($_.Name)", $_.OpenRead(), $true)
}

SharePoint 2010 Load Testing Kit

Was looking for ways to generate synthetic test data for a SharePoint out-of-the-box install today, and ran into the SharePoint 2010 Load Testing Kit.  While it doesn’t help me in this stage of the project, I could see it being useful later or on other projects.

There appears to be a lot of dependencies though:

  • Migration from 2007 to 2010
  • As it collects info from your log files, you’ll need to have everything migrated for the scripts to work
    • Data
    • Apps
    • Site Collections
    • Etc.

Could be hot though!

Migration to WordPress Network Part 3

I haven’t talked about this in a while, but everything has been running smoothly. Having only two instances to worry about is definitely better than the 5+ I had before.

However, today, I wanted to add a subdomain to a domain that is hosted in the WordPress Network. It took a few minutes to remember what I had done (thankfully all the articles I already read helped), but a few minutes later I had a subdomain running.

Essentially it is the same setup as before:

  1. Create the website in the WordPress Network Admin site (i.e. subdomainA.rebelpeon.com)
  2. Create the subdomain mirror entry in the Dreamhost panel under your main WordPress Network domain (i.e. subdomainA.rebelpeon.com)
  3. Create the subdomain mirror entry in the Dreamhost panel for the site you want (i.e. subdomainA.displaydomain.com)
  4. Add in the domain mapping
  5. Celebrate!

Search Schedule Script

To set up the crawl schedules for the default Local SharePoint Sites content source, you can use the script below:

$ssaName="Search Service Application"
$context=[Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($ssaName)

$incremental=New-Object Microsoft.Office.Server.Search.Administration.DailySchedule($context)
$incremental.BeginDay="23"
$incremental.BeginMonth="10"
$incremental.BeginYear="2011"
$incremental.StartHour="0"
$incremental.StartMinute="00"
$incremental.DaysInterval="1"
$incremental.RepeatInterval="720"
$incremental.RepeatDuration="1440"

$fullCrawl=New-Object Microsoft.Office.Server.Search.Administration.WeeklySchedule($context)
$fullCrawl.BeginDay="23"
$fullCrawl.BeginMonth="10"
$fullCrawl.BeginYear="2011"
$fullCrawl.StartHour="6"
$fullCrawl.StartMinute="00"
$fullCrawl.WeeksInterval="1"
$contentsource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssaName -Identity "Local SharePoint Sites"

$contentsource.IncrementalCrawlSchedule=$incremental
$contentsource.FullCrawlSchedule=$fullCrawl
$contentsource.Update()