The term 'Get-SPApplicationPrincipal' is not recognized as the name of a cmdlet, function, script file, or operable program.

Ignore what it says on this TechNet page: http://technet.microsoft.com/en-gb/library/jj219714(v=office.15).aspx

The command is actually Get-SPAppPrincipal.

You can see the correct command here: http://technet.microsoft.com/en-us/library/jj219664(v=office.15).aspx

Handling HTTP status code 100 in Scrapy

You might have some problems handling the 100 response code in Scrapy.  Scrapy uses Twisted on the backend, which itself does not handle status code 100 properly yet: http://twistedmatrix.com/trac/ticket/5192

The remote server will first send a response with the 100 status code, then a response with the 200 status code.  To get the 200 response, send the following header in your spider:

'Connection': 'close'

If your 200 response is also gzipped, Scrapy might not gunzip it, in which case you need to set the following header as well:

'Accept-Encoding': ''

And if Scrapy still does nothing with the responses at all, you might need to set the following spider attribute:

handle_httpstatus_list = [100]
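
The three fixes can be collected in one place; here's a minimal sketch as a settings.py fragment, assuming you'd rather configure this per project than per spider (DEFAULT_REQUEST_HEADERS and HTTPERROR_ALLOWED_CODES are Scrapy's settings-level equivalents of per-spider headers and the handle_httpstatus_list attribute):

```python
# settings.py fragment; plain Python, nothing Scrapy-specific to import.

# Close the connection after each request so the interim 100 response
# does not trip up the Twisted HTTP client.
DEFAULT_REQUEST_HEADERS = {
    'Connection': 'close',
    # Empty value disables compressed transfer, for the case where
    # Scrapy fails to gunzip the gzipped 200 response.
    'Accept-Encoding': '',
}

# Settings-level equivalent of the handle_httpstatus_list = [100]
# spider attribute: let 100 responses reach your callbacks.
HTTPERROR_ALLOWED_CODES = [100]
```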

Oh noes! Kivy dependency installation has killed my graphics

If you are in the habit of blindly installing packages onto your personal machine while following installation instructions for some shiny new program, and that program is Kivy (1.9), and your version of xorg is a bit out of date, you may find a black screen the next time you boot.

This is due to the following Kivy dependencies, which upgrade your xorg packages and break the existing xorg installation:

libgl1-mesa-dev-lts-quantal
libgles2-mesa-dev-lts-quantal

When you try to start xorg, you will get an error similar to:

No such file or directory: /usr/bin/x

To fix it, upgrade xorg to the quantal version:

apt-get install xserver-xorg-lts-quantal

I rebooted into failsafe graphics mode, then normal mode (both failed), then normal mode again, and everything was working.

Windows console encoding

Filenames on NTFS are encoded in UTF-16.  The Windows console is set by default to some other encoding entirely (a legacy OEM code page).  This makes working with files that have 'special' characters in their names nearly impossible…

In my case, I was using the following common code to delete files and folders in a directory:

set folder="C:\test"
cd /d %folder%
for /F "delims=" %%i in ('dir /b') do (rmdir "%%i" /s /q || del "%%i" /s /q)

But files with certain Unicode characters were not being deleted.  To fix this, add the following at the top of the file:

chcp 10000

This changes the console code page.  (Strictly speaking, the console cannot be set to UTF-16, and code page 10000 is actually Macintosh Roman; it evidently covered the characters involved here.  For full Unicode coverage, chcp 65001 selects UTF-8.)

Or, if you're using cmd and the dir command interactively, change the console font first to Lucida Console (the default raster font has a very limited character set).
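
If the console keeps fighting you, another way out is to bypass the code page entirely: Python 3 on Windows talks to NTFS through the wide (UTF-16) filename APIs, so 'special' characters in filenames just work.  A minimal sketch of the same delete-everything loop (the folder path is an example):

```python
import pathlib
import shutil

def empty_folder(folder):
    """Delete every file and subfolder inside `folder`, keeping the folder itself."""
    for entry in pathlib.Path(folder).iterdir():
        if entry.is_dir():
            shutil.rmtree(entry)  # recursive delete, like rmdir /s /q
        else:
            entry.unlink()        # like del

# Example, mirroring the batch script above:
# empty_folder(r"C:\test")
```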

SharePoint 2013 Provider-Hosted App Architecture Notes

Trying to build a SharePoint 2013 app has probably been the worst experience of my coding life so far.

The Microsoft docs make it sound so easy: there are so many ways you can build an app!  You can use any programming language you like!  Look, we have a REST interface!  Look, mobile app APIs!

Hey, awesome, you think, looking through the initial introductory documentation: yeah, all the different information is a bit confusing, but look, they have how-tos and the APIs are documented properly; how hard could it be?

Well, after wasting A LOT of time following guides and trying to build solutions that work, here is some information that turned out to be crucial to the architectural decisions, and that I didn't come across until much too late.  It may well be partly wrong, because I'm finding it extremely difficult to get actual facts about the different ways you can build SharePoint apps, despite the millions of confusing articles on the Microsoft site (none of which seem to contain all the information you need) and the many tutorials (written only by people coding in ASP.NET, hosting their sites on Azure, or using OAuth).

Provider-hosted apps using the REST API:

  • You can either use the JavaScript cross-domain library or use OAuth
  • Using OAuth requires an account with Azure AD, and you also need to configure your SharePoint installation to use Azure AD (and obviously the SharePoint installation needs access through firewalls etc. to communicate with Azure AD).  In addition, the app needs to be registered in Azure.
  • I've seen some tutorials that say that for testing you just need to register the app in SP and not Azure, and that you don't need Azure AD in this case; I couldn't get this to work.
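
For what it's worth, the shape of a REST call itself is the easy part; it's obtaining a valid Authorization header that all the pain above is about.  A sketch using Python's standard library (the site URL and token are placeholder values; /_api/web/lists is a real SharePoint 2013 REST endpoint):

```python
import urllib.request

# Placeholder values: the hard part is obtaining a valid token at all.
SP_SITE = 'https://sharepoint.example.com/sites/dev'
ACCESS_TOKEN = '<OAuth-or-high-trust-token>'

# Build (but do not send) a GET against the lists endpoint.
req = urllib.request.Request(
    SP_SITE + '/_api/web/lists',
    headers={
        'Accept': 'application/json;odata=verbose',
        'Authorization': 'Bearer ' + ACCESS_TOKEN,
    },
    method='GET',
)

# urllib.request.urlopen(req) would then send it.
```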

Provider-hosted apps using high trust:

  • The how-to guides all use a couple of Microsoft-provided C# files for the authentication, in addition to Windows Authentication for the site in IIS, and I can't find any documentation on how the process actually works.  Reading through the files, they get the Windows user information, so I have a feeling this method can only be used for apps that are (1) built in ASP.NET/C# running on a Windows machine, and (2) in the same domain as the SharePoint installation.

So if you want to build an app that can modify SharePoint data in any non-Microsoft language, host it on a non-Windows machine, avoid paying for an Azure subscription, and avoid changing the authentication method of your SharePoint site, your options are:

  1. A JavaScript frontend to deal with SharePoint, plus most likely a backend of whatever you like to do anything you can't from JavaScript (use 3rd-party APIs, etc.)
  2. A high-trust app to act as a proxy between your app and the SharePoint installation*

*I'm still trying to figure out how it would be possible to send the REST request I want to make to SharePoint to the proxy instead, and have the proxy sign it and forward it on to SharePoint…
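
The forwarding half of that proxy idea is straightforward; it's the signing that's unsolved.  A hypothetical sketch, where SHAREPOINT_BASE and sign_request() are stand-ins (sign_request() would have to mint the high-trust token, which is the open question):

```python
# Hypothetical sketch of a proxy's forwarding step.
SHAREPOINT_BASE = 'https://sharepoint.internal.example.com'

def forward(path, headers, sign_request):
    """Re-target an incoming REST request at the SharePoint installation,
    then let sign_request() attach whatever auth SharePoint expects."""
    target = SHAREPOINT_BASE + path
    out_headers = dict(headers)    # copy, don't mutate the caller's dict
    out_headers.pop('Host', None)  # the Host header must match the new target
    return sign_request(target, out_headers)
```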

Postfix queue management bash scripts

A couple of scripts I used while cleaning up a mail server.  I'm sure they can be improved, and the last one is quite specific to my own requirements, but I'll put them here anyway.

Move emails with a particular subject from the hold queue to the deferred queue:

# change directory to postfix's hold queue directory
cd $(postconf -h queue_directory)/hold
# loop over queue files
for i in * ; do
    # postcat each file, grep for subject "test" and, if found,
    # run postsuper -H to release the message back to the deferred queue
    postcat "$i" | grep -q '^Subject: test' && postsuper -H "$i"
done

Delete emails in the hold queue that are addressed to a recipient who has already received an email (i.e. appears in the mail log), as well as duplicate emails (with the same recipient/subject):

cd $(postconf -h queue_directory)/hold
# loop over queue files
NUM=0
for i in * ; do
    if [ -f "$i" ]; then
        IDENT=$(postcat "$i" | grep -A 1 "To:")
        RECIPIENT=$(postcat "$i" | grep "To:" | cut -c 5-)
        if grep -qF "$RECIPIENT" /root/postfixtmp/logs/mailsent.log; then
            echo "* already sent to $RECIPIENT, deleting $i" | tee -a /root/postfixtmp/queueclean.log
            echo "$IDENT" | tee -a /root/postfixtmp/queueclean.log
            NUM=$((NUM + 1))
            postsuper -d "$i"
            echo "----" | tee -a /root/postfixtmp/queueclean.log
        else
            for o in * ; do
                if [ -f "$o" ] && [ "$o" != "$i" ]; then
                    CURRENT=$(postcat "$o" | grep -A 1 "To:")
                    if [ "$CURRENT" = "$IDENT" ]; then
                        echo "* duplicate email, deleting $o" | tee -a /root/postfixtmp/queueclean.log
                        echo "$CURRENT" | tee -a /root/postfixtmp/queueclean.log
                        NUM=$((NUM + 1))
                        postsuper -d "$o"
                        echo "----" | tee -a /root/postfixtmp/queueclean.log
                    fi
                fi
            done
        fi
    fi
done
echo "Deleted $NUM emails" | tee -a /root/postfixtmp/queueclean.log

Recovering VMs that were on local storage after removing host from XenServer pool

When you remove a host from a XenServer pool, the host gets reinitialized, so any VMs on its local storage are lost.  Luckily, it's not too hard to recover the VDIs from LVM.  Here's an outline of the steps, with some links that have more info / specific commands.

  1. If you can, join the host back to the pool and connect to your shared storage; this way you get the VMs (which were moved to the pool when you added the host) and the VDIs, and only have to match the two together at the end
  2. Navigate to /etc/lvm/backup and find the file with the previous LVM data (the logical volumes should include all of your old VDIs / snapshots, and the file should reference the relevant device path, e.g. /dev/sda3)
  3. Find the current physical volume UUID
  4. Back up the /etc/lvm directory
  5. Modify the old volume group file and replace the old physical volume UUID with the current one
  6. Detach the local storage SR from the XenServer (see link below)
  7. Use vgcfgrestore to restore the old volume group file
  8. If you run vgscan you should see the newer volume group replaced with the old one (the name will be the same as the old one)
  9. Attach the local storage SR to the XenServer with the current volume group name
  10. Create a new PBD with the SCSI ID and plug it in (see link below)
  11. Scan the new SR; it should pick up the old VDIs, but without any metadata.  If you create a new VM, attach these one by one as secondary disks, mount them, and check what they are, then you can rename them and attach them back to your VMs (which should be sitting in your pool)
  12. Move all the VDIs you need over to your new SR, then you can remove your host again
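
Step 5 is just a careful text substitution in the backup file.  A sketch (the file path and UUIDs are placeholders; take the current UUID from the physical volume, and keep a copy first, per step 4):

```python
import shutil

def swap_pv_uuid(path, old_uuid, new_uuid):
    """Replace the old physical volume UUID with the current one, in place,
    keeping a copy of the original file."""
    shutil.copy(path, path + '.orig')  # step 4: keep a backup
    with open(path) as f:
        text = f.read()
    if old_uuid not in text:
        raise ValueError('old uuid not found in ' + path)
    with open(path, 'w') as f:
        f.write(text.replace(old_uuid, new_uuid))

# Placeholder usage; the real file lives in /etc/lvm/backup:
# swap_pv_uuid('/etc/lvm/backup/<volume group>', '<old pv uuid>', '<current pv uuid>')
```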

Resources
Getting physical volume uuid and finding and modifying the file: http://support.citrix.com/article/CTX128097
Removing SR: http://support.citrix.com/article/CTX131328
Adding back local storage as an SR: http://support.citrix.com/article/CTX121896

Adding mongo-10gen to apt-cacher (and Ubuntu)

On the server:

Add the following line to /etc/apt-cacher/apt-cacher.conf:
path_map = mongodb-10gen http://downloads-distro.mongodb.org/repo/ubuntu-upstart

Download the key and serve it to clients (I'd rather add the key to the repo server and have clients download it from there than have each client connect out to the internet):
gpg --keyserver keyserver.ubuntu.com --recv-keys 7F0CEB10
gpg --armor --export 7F0CEB10 > mongodb-10gen.pub
python -m SimpleHTTPServer 8000

On the client:

Create file /etc/apt/sources.list.d/10gen.list with the following contents:
deb http://your.apt-cacher.hostname:3142/mongodb-10gen dist 10gen

Download key from repo server:
wget http://your.apt-cacher.hostname:8000/mongodb-10gen.pub
apt-key add mongodb-10gen.pub
apt-get update

That should do it.  Then you can stop the python web server on the repo server.

Migrator Dragon for SharePoint 2013: fixing the crash on 'increase max upload file size on server'

When you try to upload files using this tool, the max upload size is 3 MB (mentioned here: http://gallery.technet.microsoft.com/office/The-Migration-Dragon-for-c0880e59#content)

To increase it, you need to use the 'increase max upload file size on server' button in the tool, but it was crashing for me with the following error:

Description: The process was terminated due to an unhandled exception.
Exception Info: Microsoft.SharePoint.Administration.SPUpdatedConcurrencyException

In addition to this, I was getting lots of other errors from SharePoint:

The Execute method of job definition Microsoft.SharePoint.Diagnostics.SPDiagnosticsMetricsProvider (ID 7f18b8c7-49aa-45f2-8826-67ecff862c1a) threw an exception. More information is included below.

An update conflict has occurred, and you must re-try this action….

These two errors are linked, and the solution is described here: http://support.microsoft.com/kb/939308 (although the details were slightly different on my installation: Windows Server 2008 R2 and SharePoint 2013)

To fix it, stop the SharePoint Timer Service, clear the configuration cache, and restart the SharePoint Timer Service.  The cache lives at C:\ProgramData\Microsoft\SharePoint\Config\[guid] (one folder has XML files, the other persisted files); delete all the XML files, but not the folder itself.  (The KB article says not to remove the cache.ini file but to edit it instead; I didn't have one.)  The article also mentions running a config refresh from SP admin, but I couldn't find this, so I didn't do it, and the fix worked anyway.  You might need to restart IIS as well.
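
The cache-clearing step can be scripted; a cautious sketch (it deletes only the XML files under each [guid] folder, leaving cache.ini and the folders themselves alone; the Config path matches the one above):

```python
import pathlib

def clear_config_cache(config_root):
    """Delete the cached *.xml files under each [guid] folder of the
    SharePoint configuration cache.  Stop the SharePoint Timer Service
    before calling this, and restart it afterwards."""
    removed = 0
    for guid_dir in pathlib.Path(config_root).iterdir():
        if not guid_dir.is_dir():
            continue
        for xml in guid_dir.glob('*.[Xx][Mm][Ll]'):
            xml.unlink()
            removed += 1
    return removed

# clear_config_cache(r'C:\ProgramData\Microsoft\SharePoint\Config')
```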

(Also note: I think the max you can set with the button is the value you set for the web application's maximum file upload.  SharePoint 2013 has a hard limit of 2047 MB, so you can put this value in both the SharePoint web application settings and Migrator Dragon, and you'll be able to upload large files up to 2 GB.  To change it in SP: Central Administration > Manage Web Applications > select your application > General Settings > Maximum upload size.)