Synology DS1511+ and CrashPlan

These instructions were lifted verbatim from Kenneth Larsen's blog because they just worked! You can use either vi or nano to edit the files.

  1. Download the latest release of the Linux CrashPlan client from the CrashPlan website, along with the client for your operating system if you run something other than Linux
  2. Upload the Linux client to the NAS and log in to the NAS as root using SSH.
  3. You will need ipkg installed for the steps below. If you haven't set it up already, you can follow this guide: http://forum.synology.com/wiki/index.php/Overview_on_modifying_the_Synology_Server,_bootstrap,_ipkg_etc#What_is_a_Synology_Server
  4. Install a few extra packages on your NAS from the command line:
    1. ipkg update
    2. ipkg install nano
    3. ipkg install cpio
    4. ipkg install bash
    5. ipkg install coreutils
  5. Move the uploaded client package to /opt
  6. Unpack the CrashPlan client: tar -xvf name_of_downloaded_archive_file
  7. Modify the install.sh script in the newly created directory to use bash as your shell. The first line in the script should be replaced with this one: #!/opt/bin/bash
  8. Install CrashPlan using the options below. When asked about Java, allow it to be downloaded:
    • CrashPlan will install to: /opt/crashplan
    • And put links to binaries in: /opt/bin
    • And store data in: /opt/crashplan/manifest
    • Your init.d dir is: /etc/init.d
    • Your current runlevel directory is: /usr/syno/etc/rc.d
  9. Modify /opt/crashplan/bin/run.conf by adding -Djava.net.preferIPv4Stack=true as an additional option at the end of the two configurations. (This was already there when I did the install.)
  10. Remove the command-line options for the ps process in the /opt/crashplan/bin/CrashPlanEngine file, since ps doesn't accept parameters on the Synology NAS: sed -i 's/ps -eo /ps /' CrashPlanEngine; sed -i 's/ps -p /ps /' CrashPlanEngine
  11. Modify the /usr/syno/etc/rc.d/S99crashplan file line 1 to : #!/opt/bin/bash
  12. Modify the /opt/crashplan/bin/CrashPlanEngine file line 1 to: #!/opt/bin/bash
  13. Modify the /opt/crashplan/bin/CrashPlanEngine file line 14 to use the full path for nice: /opt/bin/nice (steps 10 through 13 are consolidated into a shell sketch after this list)
  14. Start the CrashPlan service: /usr/syno/etc/rc.d/S99crashplan start
  15. Validate that the service is running: netstat -an | grep ':424.' should show two listeners:
    • tcp        0      0 0.0.0.0:4242            0.0.0.0:*               LISTEN
    • tcp        0      0 127.0.0.1:4243          0.0.0.0:*               LISTEN
  16. Edit /etc/rc.local and add /usr/syno/etc/rc.d/S99crashplan start (without quotes) so the service starts after a reboot.
  17. Install your desktop client and point it at the headless service you just installed. Follow the instructions on the CrashPlan website for this (http://support.crashplan.com/doku.php/how_to/configure_a_headless_client)
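
For reference, here is the gist of steps 10 through 13 as shell commands. This is a sketch, not gospel: it assumes the install locations above, and the line-14 edit assumes nice appears on that line exactly as step 13 describes, so double-check against your version of the client before running it.

cd /opt/crashplan/bin
# step 10: strip the ps options the Synology ps doesn't understand
sed -i 's/ps -eo /ps /' CrashPlanEngine
sed -i 's/ps -p /ps /' CrashPlanEngine
# steps 11 and 12: point both scripts at the ipkg bash
sed -i '1s|.*|#!/opt/bin/bash|' /usr/syno/etc/rc.d/S99crashplan
sed -i '1s|.*|#!/opt/bin/bash|' CrashPlanEngine
# step 13: give nice its full path on line 14
sed -i '14s|nice|/opt/bin/nice|' CrashPlanEngine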

Update 8/19: Just to follow up on this: there is a much better way to get this working now (and it's what I use). Check it out at http://pcloadletter.co.uk/2012/01/30/crashplan-syno-package/


New Home Server Setup

I've been meaning to do this for a while, but I hadn't found a suitable replacement until recently. I am decommissioning the server at home. It's loud, large, and sucks down a lot of power for what I use it for (Windows Home Server). It was nice because I could quickly and easily spin up some VMs and poke around, but I'll still be able to do that.

Instead, I picked up a Synology DS1511+ NAS.  This little appliance is pretty darn slick.  It can pretty much do everything I was doing, in a smaller, quieter, and cooler form factor.  Since it uses an Atom processor, it runs a fairly familiar flavor of Linux, so you can do quite a bit with it.  Plus, a lot of the default stuff it comes with is quite nice!

I’ll be throwing up a few copy/pastes on the site so that I can quickly re-reference.  Oh, and there’s another SharePoint article in the works too.  Busy, busy!

SharePoint 2010 Synthetic File Data

Still trying to work through creating synthetic data for an out-of-the-box SharePoint performance test. The script below creates a new site collection (so it doesn't interfere with anything else and is easy to clean up) and uploads all the test data into it. The biggest downside right now is that the data is generated on disk first and then uploaded, which requires enough local disk space to hold all of it. Not a huge issue for me, but possibly for you.

The general idea for the upload came from a few places; the file creation I pieced together locally.

#USER Defined Variables
#Specify the extension type of files you want uploaded
$strDocTypes = @(".docx",".xlsx", ".pptx", ".pdf")
#The max amount of data generated in MB
$maxSize = 50
#The max size one file could be in MB
$maxFileSize = 10
#Intermediate folder where the test data is placed
$fileSource = "F:\TestData"
#New Content Database (for easy removal)
$dbName = "Portal_ContentDB2"
#New Site collection template
$template = "SPSPORTAL#0"
#Account owner
$siteOwner = "TEST\Administrator"
#Web Application address
$webApp = "https://portal"
#Site Collection Address
$siteCollection = "/sites/1"
#Do not edit anything beyond this line

#Create all the test data using FSUTIL
#Generate the files inside $fileSource so the upload step below can find them
Set-Location $fileSource

$rand = New-Object System.Random
do {
	$guid = [guid]::NewGuid()
	$guid =  $guid.ToString()
	$fileName = $guid+$strDocTypes[$rand.next(0,$strDocTypes.length)]
	$rand1 = $rand.nextdouble()
	$rand2 = $rand.nextdouble()
	$rand3 = $rand.nextdouble()
	[int]$fileSize = 1048576*$rand1*$rand2*$rand3*$maxFileSize
	FSUTIL FILE CREATENEW $fileName $fileSize
	$fileTotalBytes = $fileTotalBytes + $fileSize
	$fileTotal = $fileTotalBytes/1024/1024
}
#Data generation keeps going until the amount of data is > $maxSize
while ($fileTotal -le $maxSize)

#Creation of the new content database and site collection
$siteCollectionURL = $webApp + $siteCollection
New-SPContentDatabase $dbName -WebApplication $webApp
New-SPSite -url $siteCollectionURL -OwnerAlias $siteOwner -Name "Test Doc Library" -Template $template -ContentDatabase $dbName

#uploading of all the generated data into the $siteCollectionURL/Documents folder
$spWeb = Get-SPWeb -Identity $siteCollectionURL
$listTemplate = [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary
$spFolder = $spWeb.GetFolder("Documents")
$spFileCollection = $spFolder.Files
Get-ChildItem $fileSource | ForEach {
	$stream = $_.OpenRead()
	$spFileCollection.Add("Documents/$($_.Name)", $stream, $true) | Out-Null
	#Close each stream so file handles don't pile up during big uploads
	$stream.Close()
}
$spWeb.Dispose()
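
Since everything lands in its own site collection and content database, teardown is simple. A minimal cleanup sketch, assuming the same values as above (run it from the SharePoint 2010 Management Shell, or load the snap-in first as shown):

#Load the SharePoint snap-in if you're in a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
#Remove the test site collection, then its dedicated content database
Remove-SPSite -Identity "https://portal/sites/1" -Confirm:$false
Remove-SPContentDatabase -Identity "Portal_ContentDB2" -Confirm:$false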

SharePoint 2010 Load Testing Kit

Was looking for ways to generate synthetic test data for a SharePoint out-of-the-box install today, and ran into the SharePoint 2010 Load Testing Kit.  While it doesn’t help me in this stage of the project, I could see it being useful later or on other projects.

There appear to be a lot of dependencies, though:

  • Migration from 2007 to 2010
  • As it collects info from your log files, you’ll need to have everything migrated for the scripts to work
    • Data
    • Apps
    • Site Collections
    • Etc.

Could be hot though!

Migration to WordPress Network Part 3

I haven't talked about this in a while, but everything has been running smoothly. Having only two instances to worry about is definitely better than the 5+ I used to run.

However, today, I wanted to add a subdomain to a domain that is hosted in the WordPress Network. It took a few minutes to remember what I had done (thankfully all the articles I already read helped), but a few minutes later I had a subdomain running.

Essentially it is the same setup as before:

  1. Create the website in the WordPress Network Admin site (i.e. subdomainA.rebelpeon.com)
  2. Create the subdomain mirror entry in the Dreamhost panel under your main WordPress Network domain (i.e. subdomainA.rebelpeon.com)
  3. Create the subdomain mirror entry in the Dreamhost panel for the site you want (i.e. subdomainA.displaydomain.com)
  4. Add in the domain mapping
  5. Celebrate!

Search Schedule Script

To set up the crawl schedules for the default Local SharePoint Sites content source, you can use the script below:

$ssaName="Search Service Application"
$context=[Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($ssaName)

$incremental=New-Object Microsoft.Office.Server.Search.Administration.DailySchedule($context)
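#Daily incremental crawl: begins 10/23/2011 at midnight and repeats every 720 minutes (12 hours) within a 1440-minute window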
$incremental.BeginDay="23"
$incremental.BeginMonth="10"
$incremental.BeginYear="2011"
$incremental.StartHour="0"
$incremental.StartMinute="00"
$incremental.DaysInterval="1"
$incremental.RepeatInterval="720"
$incremental.RepeatDuration="1440"

$fullCrawl=New-Object Microsoft.Office.Server.Search.Administration.WeeklySchedule($context)
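#Weekly full crawl: begins 10/23/2011 and runs once a week at 6:00 AM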
$fullCrawl.BeginDay="23"
$fullCrawl.BeginMonth="10"
$fullCrawl.BeginYear="2011"
$fullCrawl.StartHour="6"
$fullCrawl.StartMinute="00"
$fullCrawl.WeeksInterval="1"
$contentsource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssaName -Identity "Local SharePoint Sites"

$contentsource.IncrementalCrawlSchedule=$incremental
$contentsource.FullCrawlSchedule=$fullCrawl
$contentsource.Update()
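
To sanity-check the result, and optionally kick off a full crawl right away, something like this should work (StartFullCrawl is a standard method on the content source object):

#Confirm the schedules took
$contentsource.IncrementalCrawlSchedule
$contentsource.FullCrawlSchedule
#Optionally start a full crawl immediately
$contentsource.StartFullCrawl()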

SQL Server Issues

Last week I was beating my head against the table, because a VM I had quickly created wasn’t allowing SQL to install.  I kept receiving the following error in the detailed SQL error log:

Configuration action failed for feature SQL_Engine_Core_Inst during timing ConfigRC and scenario ConfigRC.
External component has thrown an exception.
The configuration failure category of current exception is ConfigurationFailure
Configuration action failed for feature SQL_Engine_Core_Inst during timing ConfigRC and scenario ConfigRC.
System.Runtime.InteropServices.SEHException: External component has thrown an exception.
at Microsoft.Win32.SafeNativeMethods.CloseHandle(IntPtr handle)
at System.Runtime.InteropServices.SafeHandle.InternalDispose()
at System.Runtime.InteropServices.SafeHandle.Dispose(Boolean disposing)
at System.Diagnostics.Process.Close()
at System.Diagnostics.Process.Dispose(Boolean disposing)
at System.ComponentModel.Component.Dispose()
at Microsoft.SqlServer.Configuration.SqlEngine.SqlServerServiceBase.WaitSqlServerStart(Process processSql)
at Microsoft.SqlServer.Configuration.SqlEngine.SqlEngineDBStartConfig.ConfigSQLServerSystemDatabases(EffectiveProperties properties, Boolean isConfiguringTemplateDBs, Boolean useInstallInputs)
at Microsoft.SqlServer.Configuration.SqlEngine.SqlEngineDBStartConfig.DoCommonDBStartConfig(ConfigActionTiming timing)
at Microsoft.SqlServer.Configuration.SqlConfigBase.SlpConfigAction.ExecuteAction(String actionId)
at Microsoft.SqlServer.Configuration.SqlConfigBase.SlpConfigAction.Execute(String actionId, TextWriter errorStream)
Exception: System.Runtime.InteropServices.SEHException.
Source: System.
Message: External component has thrown an exception.

It turns out that I had accidentally downloaded the checked (debug) build of Windows 2008 R2 SP1, and, well, you can't install SQL Server on that version. Needless to say, the error message makes this completely obvious. Found the hint to look at the ISO I was using on MSDN Social.
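
If you want to check whether an installed copy is a checked build before burning hours on it, the BuildLabEx registry value is one place to look; it contains fre for free (retail) builds and chk for checked builds:

#Prints something like 7601.17514.amd64fre.win7sp1_rtm.101119-1850 ("fre" = free build)
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').BuildLabEx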

LAFHA Math

Just throwing this down, as this math (or an explanation of it) eluded us until we could actually work it out from paystubs. LAFHA is an in/out transaction (a deduction and then an addition); a worked example follows the list.

  • Monthly Taxable Income = Monthly Gross – LAFHA
  • Figure Taxes based on Monthly Taxable Income
  • Monthly Net Pay = Monthly Taxable Income – Taxes + LAFHA
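
A worked example with made-up numbers (hypothetical salary and a flat 30% tax rate, purely for illustration):

  • Monthly Gross = $8,000, LAFHA = $1,500
  • Monthly Taxable Income = $8,000 – $1,500 = $6,500
  • Taxes = 30% of $6,500 = $1,950
  • Monthly Net Pay = $6,500 – $1,950 + $1,500 = $6,050

The LAFHA portion comes back untaxed, which is the whole point.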

CloudFlare

I was using CloudFlare for a while, but realized it was causing a few issues. In theory it sounds like a great idea, essentially the same as any CDN, but free. However, when editing posts on other websites, WordPress's AJAX calls could never format the text in the visual editor. It wasn't really that it failed to render; it was more that nothing rendered in the body field until a timeout was hit.

Simply not acceptable.  As soon as I deactivated CloudFlare, it worked great.  Needless to say, it’s not something I really need, so I just removed it.

Migration to WordPress Network Part 2

I had two outstanding items to figure out before migrating my last site.  Today I was able to knock off one.

My Director installation used to live in a directory under my website. Unfortunately, with the migration to a WordPress network, that wouldn't work. This is because everything is done via DNS redirection, so a directory doesn't physically sit where you think it does. I can only imagine the nightmares it could cause.

Instead, I moved it to a sub domain.  This seemed to fix all the issues, and it is actually pretty nice there.  I just had to update a few links on various pages, make a few php.ini updates, and all was well.

Now it's on to the massive site. I think I'll have to import only a few records at a time; it turns out that with all the media attached to each post, the import kept timing out.

Update: Well, that was a fun experiment. Since I can't seem to upload any of my previous entries (Dreamhost kills the script), I've decided things are working OK with two WordPress installs, and that's how it will stay. The other site is huge anyway, so it makes sense…