Kamranicus

Personal and development blog of Kamran Ayub

about me

Hi, my name is Kamran. I am a web developer and designer, residing in Minnesota. I’ve been programming since 2001 and I am familiar with many different languages, both web-based and server-based. I love to tinker around and think of new ways to solve problems.

TypeScript in Action at Twin Cities Code Camp 20 (TCCC20)

Update (4/20/16): The presentation is now up on YouTube (and the slides).


I’ll be speaking at 8:30am this weekend at Twin Cities Code Camp. My talk will be about using TypeScript in the context of an Angular 2 ASP.NET MVC web application, but it will focus on how TypeScript enhances my productivity and will showcase some features of the language. I’ve done the talk previously, internally at General Mills, and had a great response, so I thought I’d piggyback on the success of my previous Demystifying TypeScript talk (which is also on YouTube).

If you’re at all interested in seeing how TypeScript development looks in action, you should attend and I’d recommend going through my previous talk if you’re still not convinced TypeScript is awesome. This talk assumes you are at least open to the idea of developing in TypeScript and are curious to see how it can be used.

It’ll be a fun time and a busy weekend—I’ll have to leave the conference right after my talk to participate in Ludum Dare, where I’ll be helping to build a game in 72 hours. I’m sad I’ll miss the speaker happy hour and the prizes but it’s for a good reason! Hope to see you Saturday!

written in Conferences, JavaScript, Talks, TypeScript

Generating an Encryption Certificate for PowerShell DSC in WMF5

I’m currently building out a PowerShell DSC pull server cluster at work. If you aren’t familiar with DSC, I’ll talk more about it in an upcoming post that ties it all together. The long and short of it is that DSC is a way to store configuration as code and automate the configuration of many servers at once.

In the recent Windows Management Framework 5 release, Microsoft improved its support and feature set for DSC, but with a new release come new surprises. The first surprise you may run into, as we did, is that your old WMF4 way of encrypting MOF files doesn’t work. In WMF5, the requirements for the certificate used to secure MOF files are stricter. Taken from MSDN:

  1. Key Usage:
     - Must contain: ‘KeyEncipherment’ and ‘DataEncipherment’.
     - Should not contain: ‘Digital Signature’.
  2. Enhanced Key Usage:
     - Must contain: Document Encryption (1.3.6.1.4.1.311.80.1).
     - Should not contain: Client Authentication (1.3.6.1.5.5.7.3.2) and Server Authentication (1.3.6.1.5.5.7.3.1).

If you read my previous foray into certificates with Azure Key Vault, you know I’m pretty green when it comes to certificate management and terminology. I really didn’t know what this stuff meant—I mean, I understand a certificate has key usages and enhanced key usages, but how does it get them? It has to do with the certificate request and the template used to provision your certificate.
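
If you already have a certificate and want to check it against these requirements, here’s a minimal PowerShell sketch for inspecting its key usages (the subject name is a placeholder):

# Minimal sketch: inspect a certificate's KU/EKU (subject is a placeholder)
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq 'CN=me@example.com' }

# Key Usage: should include KeyEncipherment and DataEncipherment
($cert.Extensions | Where-Object { $_ -is [System.Security.Cryptography.X509Certificates.X509KeyUsageExtension] }).KeyUsages

# Enhanced Key Usage: should include Document Encryption (1.3.6.1.4.1.311.80.1)
$cert.EnhancedKeyUsageList | Format-Table FriendlyName, ObjectId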

It turns out Microsoft recommends obtaining a certificate from Active Directory Certificate Services. That’s cool, but I’m just a developer who wants to work on DSC; I don’t have an ADCS server to give me certificates during testing. That’s a different team altogether, and when their primary guy is out of the office, I’m a bit stuck.

Update (4/13): TechNet now has a guide on how to generate certificates for WMF5. I’m leaving the rest of this post as-is for posterity.


I thought I could use a self-signed certificate while I wait for a “for real” one later. After searching around for a method to create a certificate with the required KU and EKU specs, I found a lot of answers suggesting OpenSSL. I’ve never used OpenSSL before, so I gave it a try and found it a bit confusing—I think I could have gotten it to work, but instead I came across a random PowerShell article (unrelated to anything) that used a utility called certreq, which can handle providing custom key usages. Problem solved!

You just need to create a file to define your certificate settings, MyCert.inf:

[Version]
Signature = "$Windows NT$"

[Strings]
szOID_ENHANCED_KEY_USAGE = "2.5.29.37"
szOID_DOCUMENT_ENCRYPTION = "1.3.6.1.4.1.311.80.1"

[NewRequest]
Subject = "cn=me@example.com"
MachineKeySet = false
KeyLength = 2048
KeySpec = AT_KEYEXCHANGE
HashAlgorithm = Sha1
Exportable = true
RequestType = Cert

KeyUsage = "CERT_KEY_ENCIPHERMENT_KEY_USAGE | CERT_DATA_ENCIPHERMENT_KEY_USAGE"
ValidityPeriod = "Years"
ValidityPeriodUnits = "1000"

[Extensions]
%szOID_ENHANCED_KEY_USAGE% = "{text}%szOID_DOCUMENT_ENCRYPTION%"

Just change the Subject line to whatever you need in your case.

Then execute certreq using the input file:

certreq -new MyCert.inf MyCert.cer

Certreq should be available if you have Makecert—if you aren’t finding it in the default command prompt, try using the Visual Studio Command Prompt. Once you execute the command it will generate a public key file and install the private/public key pair into your CurrentUser personal certificate store:

PS> dir Cert:\CurrentUser\My

From there, you can export the private/public key pair and install it on your DSC nodes.
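
For reference, here’s a rough sketch of that export/import step using the PKI cmdlets; the subject name, file paths, and password handling are placeholders for your own setup:

# Rough sketch; the subject, paths, and password below are placeholders
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq 'CN=me@example.com' }

# Export the private/public key pair as a password-protected PFX
$password = Read-Host -AsSecureString -Prompt 'PFX password'
Export-PfxCertificate -Cert $cert -FilePath .\DscCert.pfx -Password $password

# Export just the public key (used to encrypt the MOF at authoring time)
Export-Certificate -Cert $cert -FilePath .\DscCert.cer

# On each DSC node, import the key pair into the machine store
Import-PfxCertificate -FilePath .\DscCert.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $password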

Example screenshot

Until you get a signed certificate from your CA, this should work. Hope that helps!

written in DSC, PowerShell, Security

See You at Build 2016

I love attending conferences. At the end of the month I’ll be attending Build 2016, Microsoft’s annual developer conference. At the end of January, some friends and I attended PAX South in San Antonio (not for work) and it was a blast. A gaming convention has a different air than a developer conference, but both have a certain energy—it’s very motivating being around fellow developers, learning new tech, and seeing what’s new. I’ve attended Build several times in the past and it’s always been fun, especially when you meet coding “celebs”. Usually I take my wife even though she doesn’t attend the conference, and we typically stay longer. This year, not only will she be joining me but our friend will be coming as well—both of them will be traipsing around San Francisco while I attend the conference with a few coworkers. My partners in crime, Erik & Alan, will be joining me, so it’ll be a ton of fun (yes, we all work together and yes, we just went to PAX together). Going to a conference is fun; going to a conference with your friends is even more fun.

After Build, we’ll leave my coworkers behind and fly up to Portland to explore the area. I’ve been to downtown Seattle for PAX Prime before, but I’ve only been in the surrounding area once, when I was a teenager, and I always thought it was beautiful. My wife has never been to the Pacific Northwest, so we thought it’d be fun to spend a few extra days and drive around. We’re all looking forward to it.

If you’re at Build and you follow me, be sure to send me a tweet and we’ll meet up!

written in Azure, Conferences, Microsoft

Handling Multiple Origins in CORS Using URL Rewrite

Here’s a quick tip if you’re trying to figure out how to handle cross-origin requests (CORS) when you have multiple origins (namely, HTTP and HTTPS). This works in IIS 8.0 and above, including Azure, as long as you have the URL Rewrite module installed.

The CORS header looks like this:

Access-Control-Allow-Origin: http://mydomain.com

The spec is very strict. The header can only contain a single value, and the value must be fully qualified, which means if you have a site that is served over both HTTP and HTTPS (or multiple domains), you need to dynamically build this header in your response. Many tutorials and blog posts say to specify * as the value—DO NOT DO THIS! That means any origin (domain) can embed/request assets from your website. Unless you have hundreds of sites doing this (i.e., you’re a CDN), you should only whitelist the domains that can include resources from your site.

If you are sharing resources with a known number of hosts, the following method will help. If it’s a dynamic list, you will need to programmatically add the Access-Control-Allow-Origin header depending on the incoming Origin header—something I won’t cover here.

Rather than messing with C# and modifying outgoing responses, I ended up using a simple URL rewrite rule, proposed by this Stack Overflow answer. All it does is add a header to the outbound response when the regular expression matches—in this case, whitelisting only the HTTP and HTTPS versions of my domain (or subdomain).

<system.webServer>
   <httpProtocol>
     <customHeaders>
         <add name="Access-Control-Allow-Headers" value="Origin, X-Requested-With, Content-Type, Accept" />
         <add name="Access-Control-Allow-Methods" value="POST,GET,OPTIONS,PUT,DELETE" />
     </customHeaders>
   </httpProtocol>
   <rewrite>
     <outboundRules>
       <clear />
       <rule name="AddCrossDomainHeader">
         <match serverVariable="RESPONSE_Access_Control_Allow_Origin" pattern=".*" />
         <conditions logicalGrouping="MatchAll" trackAllCaptures="true">
           <add input="{HTTP_ORIGIN}" pattern="(http(s)?:\/\/((.+\.)?mydomain\.com))" />
         </conditions>
         <action type="Rewrite" value="{C:0}" />
       </rule>
     </outboundRules>
   </rewrite>
</system.webServer>

This uses the URL Rewrite module’s special syntax (the RESPONSE_ prefix, with dashes replaced by underscores) to add an outgoing response header. The rule matches the incoming Origin header against the whitelist pattern and, if it matches, sets the CORS header to that origin’s value.

That was all I had to do!

Note: Since I just converted over to always-on SSL, I no longer need this workaround, but multiple origins are pretty common when dealing with CORS, so this solution will come in handy.

written in Azure, C#, Security

Securing Secrets Using Azure Key Vault and Config Encryption

Secrets. We all have them. I’m talking about secrets like your database connection strings, API keys and encryption keys. Where are you storing yours? Are you storing them…

  1. In your application’s source code?
  2. In a config file (appSettings or otherwise) checked into source control?
  3. In a database?
  4. In a managed portal, like Azure?

I hope you aren’t storing them hardcoded. You’re probably doing option 2 or a hybrid of options 2-4. Even if you use an external data source, it’s hard to escape the need for secrets in local development unless you force your app to rely on connectivity, which makes it hard to work offline.

In this post I’m going to provide some suggestions on how to store your secrets better using Azure Key Vault and config file encryption, specifically in the context of Azure, though the concepts apply to any hosting environment.

written in Azure, C#, Cloud, Encryption, Secrets, Security Read on →

Tools of the Trade 2016

Sometimes you get so caught up in the work you do on a daily basis that you forget what it was like to start your job on day one—not knowing anything about the tools, extensions, and general utilities you take for granted now, 6 years into your career. It seems like every month I find a new extension or utility that is useful to me. I wanted to share my toolbelt, in case it contains something you’ve never heard of and causes you to exclaim in excitement about something awesome that you’ll start using today.

written in C#, Development, Tools, Visual Studio Read on →

Planet Wars AI Competition With C# and Excalibur.js

Planet Wars

This past weekend Erik and I built out a Planet Wars server (written in C#) and an Excalibur.js-powered visualization (written in TypeScript). Planet Wars is an AI competition where you build an AI that competes against another player to control a solar system. A map consists of several planets that have different growth rates and an initial number of ships. You have to send out a “fleet” of ships to colonize other planets and the player who controls the most planets and has destroyed their opponent’s ships wins the game.

At work we are hosting our 6th Code Camp and recently we started hosting an AI competition internally. You can find past competition agents for Ants and Elevators, for example.

The visualization for Planet Wars is fairly simple, made even simpler using the power of Excalibur.js, the engine we work on during our spare time. We basically just use an Excalibur timer to query the status of the game state and update the state of all the actors in the game. For moving the fleets, we just use the Actor Action API.

For the game server, we are using a HighFrequencyTimer to run a 30fps server; clients just send commands via HTTP, so any kind of agent will work: Python, Perl, PowerShell, whatever! Anything that speaks HTTP can be a client. The server runs in the context of a website so we can easily query the state using a singleton GameManager. This wouldn’t work in a load-balanced environment, but it doesn’t matter since people develop agents locally and we run the simulations on one server at high speed to produce the results. If you backed the server with a data store, you could replay games, but right now there’s only an in-memory implementation.
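
Just to illustrate the “anything that speaks HTTP” point, here’s a hypothetical PowerShell agent sketch; the endpoint paths and payload shape are made up, so the real server’s API will differ:

# Hypothetical agent loop; the endpoints and payload shape are made up
$server = 'http://localhost:5000'

while ($true) {
    # Poll the current game state
    $state = Invoke-RestMethod -Uri "$server/api/state" -Method Get

    # Naive strategy: send half the ships from our strongest planet
    # to the weakest planet we don't own
    $mine   = $state.planets | Where-Object { $_.owner -eq 'me' } | Sort-Object ships -Descending | Select-Object -First 1
    $target = $state.planets | Where-Object { $_.owner -ne 'me' } | Sort-Object ships | Select-Object -First 1

    if ($mine -and $target) {
        $body = @{ from = $mine.id; to = $target.id; ships = [int]($mine.ships / 2) } | ConvertTo-Json
        Invoke-RestMethod -Uri "$server/api/fleets" -Method Post -Body $body -ContentType 'application/json'
    }

    Start-Sleep -Milliseconds 500
}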

To keep the server and client models in-sync, we use Typewriter for Visual Studio which is amazing and super useful not just for syncing client/server but also generating web clients, interfaces, etc. from C# code. I plan to write a separate post on some Typewriter tips for Knockout.js and Web API.

written in AI, C#, Excalibur.js, Games, JavaScript, TypeScript, Typewriter

2015: A Year in Review

2015 was a very eventful (and fulfilling) year for me and my wife. Let’s break it down, shall we?

Living abroad for 6 months

Bordeaux, France

By far the most impressive thing I did last year was to take a 6 month sabbatical and live abroad in France with my wife. Though I’ve written about it previously, I left out the entire part where we chronicled our adventure in a series of publications on Medium. We kept it anonymous during the trip to avoid any potential issues but now that it’s over, I will list the different publications so you can read back through what we did for 6 months (spoiler: we did a lot).

Just to be clear when you’re reading, I am Vincent and my wife is Celeste.

It was an experience I’ll never forget and one that probably won’t be repeated anytime soon. My wife and I both felt it was the right time and that we’d probably get little to no chance to do something so crazy once we had kids and “settled down.” We still hope to continue traveling once a year or every couple of years, especially after an experience like that. One of my plans for 2016 is to compile all these posts into a book that we print and keep for ourselves and our future children.

New House

Bay window

We weren’t in a position to buy a house so soon after a 6-month sabbatical, but we still thought it was best to move from apartment living to a real house, especially after living in a 400 sq ft space in France. We found a great place to rent in Minneapolis that’s pretty close to both our workplaces, friends, and family. We’ve done a few things to make it more like home, we’ve been really enjoying it so far, and our landlord is superb. The photo above is our enhancement to the bay window: my brother-in-law built the spanning bench between the bookcases and I built the cushion. I removed the tall blinds that covered the window so we could open up the room and add extra seating. It turned out so well!

New Dog

Dogger

My wife has always wanted a dog, ever since we moved into an apartment together—except our apartment complex never let us have dogs. We cat-sat (is that a word?) for 2 years for some friends, and then they took her back down to Texas, where they bought a house and she happily frolics outside. In August (Dogust?) we went to the humane society and, pretty much on a whim, took in a cute dog we named Rennes (after one of our favorite French cities we visited). She’s a black lab and border collie mix. She’s awesome, even though she jumps the fence to chase squirrels (we’re working on that). We love her a ton.

Keep Track of My Games

KTOMG

2015 marked the 4th year that KTOMG has been around since its humble beginnings, and being abroad gave me time to focus and finish a major rewrite of the codebase in May. Since then I’ve released public lists and capped the year off with Steam syncing, just to name a few features.

Speaking

Even though I was abroad for 6 months, I still managed to give a talk this year at Twin Cities Code Camp 19: an update to my popular Demystifying TypeScript presentation. You can also find 2014’s version on YouTube.

Making Games

Minotaur

In August, I participated in the Ludum Dare 33 game jam where some friends and I created a minotaur hack-n-slash game, Crypt of the Minotaur (source). I love participating in game jams and by extension, helping to contribute to the Excalibur.js game engine.

Playing Games

Somehow, after all that, I still managed to log hundreds of hours into my gaming habit. Since I added public lists to KTOMG, why don’t you go take a look at my Top 10 Played Games of 2015? Yes, some of those came out in 2014, but I didn’t play or finish them until this year. While abroad, I managed to bring my laptop, 3DS, and PS4, so I played a lot of Destiny and finished the remastered Grim Fandango, among other PS4/3DS games. My laptop wasn’t that great, but I was still able to enjoy Pillars of Eternity, a throwback Baldur’s Gate-style RPG. In November, I started playing Fallout 4 and have since logged over 75 hours in it. It’s definitely tough to juggle both hobbies: playing games and developing a site that helps manage those games. During time off, I usually try to mix my time between them to satisfy both needs, and sticking to a monthly release cadence helps a lot to prioritize work.

Work & Friends

My work has been going swimmingly; after my sabbatical I returned to a team with one of my best friends. Speaking of friends, I made more this year, fulfilling a goal I set at the start of 2015—not only abroad but also at home. Board game nights, Dungeon World sessions, and a Star Wars marathon are just some of the highlights of the fun stuff we’ve done with our [awesome] circle of friends.

Looking towards a new year

Cheers to 2016, let’s hope it’s even bigger and better than 2015 and brings more happiness and joy to my life.

written in Accomplishments, Life, Year in Review

Influencing Your Kudu Deployment Through Git Commit Messages

If you’re on Windows Azure and using continuous deployment through Git, you may not know that an open source platform called Kudu performs your deployment behind the scenes. If this is the first time you’ve heard of Kudu and you’ve been using Azure for a while, it’s time to get acquainted. Kudu is amazing. It has a whole REST API that lets you manage deployments, trigger builds and webjobs, view processes, use a command prompt, and a ton more.

You can get to your Kudu console by visiting

https://<yoursite>.scm.azurewebsites.net

The .scm. part is the key, as that is where the Kudu site is hosted.

Customizing Kudu deployments

One of the other things it offers is a customized deployment script. I’ve customized mine because I have a test project whose tests I run during the build. This is useful since it’ll fail the build if I make any changes that break my tests, forcing me to keep things up to date and resulting in a higher-quality codebase.

If you want to generate your own script, it’s pretty easy. Just follow the steps outlined here. For example, after customizing my script here’s what my section looks like to run my tests:

:: 3. Build unit tests
call :ExecuteCmd "%MSBUILD_PATH%" "%DEPLOYMENT_SOURCE%\src\Tests\Tests.csproj" /nologo /verbosity:m /t:Build /p:AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%.\" %SCM_BUILD_ARGS%

IF !ERRORLEVEL! NEQ 0 goto error

All I really did was copy step 2 in the script that builds my web project and just change the path to my tests project.

Finally, I run the tests using the packaged NUnit console runner (checked into source control):

call :ExecuteCmd "%DEPLOYMENT_SOURCE%\tools\nunit\nunit-console.exe" "%DEPLOYMENT_SOURCE%\src\Tests\bin\Release\Tests.dll" /framework:v4.5.1
IF !ERRORLEVEL! NEQ 0 goto error

Simple!

Now the fun part

One thing you’ll notice if you start running tests on your builds is that they slow down your continuous deployment workflow. 90% of the time this is acceptable; after all, you can wait a few minutes to see your changes show up on the site. But sometimes, especially for production hotfixes or trial-and-error config changes, those 3-5 minutes become unbearable.

In cases like this, I’ve set up a little addition to my script that will read the git commit message and take action depending on what phrases it sees.

For example, let’s say I commit a change that is just a config change and I know I don’t need to run any tests or I really want the quick build. This is what my commit message looks like:

[notest] just changing App.config

That phrase [notest] is something my script looks for at build time and if it’s present it will skip running tests! You can use this same logic to do pretty much anything you want. Here’s what it looks like after step 3 in my script:

:: Above at top of file

IF NOT DEFINED RUN_TESTS (
   SET RUN_TESTS=1
)

:: 4. Run unit tests
echo Latest commit ID is "%SCM_COMMIT_ID%"

call git show -s "%SCM_COMMIT_ID%" --pretty=%%%%s > commitmessage.txt
SET /p COMMIT_MESSAGE=<commitmessage.txt

echo Latest commit message is "%COMMIT_MESSAGE%"

IF NOT "x%COMMIT_MESSAGE:[notest]=%"=="x%COMMIT_MESSAGE%" (
   SET RUN_TESTS=0
)

IF /I "%RUN_TESTS%" NEQ "0" (
  echo Running unit tests
  call :ExecuteCmd "%DEPLOYMENT_SOURCE%\tools\nunit\nunit-console.exe" "%DEPLOYMENT_SOURCE%\src\Tests\bin\Release\Tests.dll" /framework:v4.5.1
  IF !ERRORLEVEL! NEQ 0 goto error
) ELSE (
  echo Not running unit tests because [notest] was present in commit message
)

Alright, there’s definitely some batch file black magic incantations going on here! So let’s break it down.

echo Latest commit ID is "%SCM_COMMIT_ID%"

Kudu defines several useful environment variables that you have access to, including the current commit ID. I’m just echoing it out so I can debug when viewing the log output.

call git show -s "%SCM_COMMIT_ID%" --pretty=%%%%s > commitmessage.txt
SET /p COMMIT_MESSAGE=<commitmessage.txt

Alright. This took me some real trial and error. Git lets you show any commit message and can format it using a printf format string (--pretty=%s). However, due to the weird escaping rules of batch files and variables, this requires not one but four % signs. Go figure.

Next I redirect the output to a file; this is only so I can read the file back and store the message in a batch variable (COMMIT_MESSAGE) on the next line. Kudu team: it would be sweet to add an SCM_COMMIT_MESSAGE environment variable!

IF NOT "x%COMMIT_MESSAGE:[notest]=%"=="x%COMMIT_MESSAGE%" (
   SET RUN_TESTS=0
)

Okay, what’s going on here? I’ll let Stack Overflow explain. The :[notest]= portion REPLACES the term “[notest]” in the preceding variable (COMMIT_MESSAGE) with an empty string. The x prefix character guards against batch file weirdness. So if [notest] is NOT present, the replacement changes nothing and the two strings match; if it is present, the strings differ. Since we want to act when [notest] is present, we use IF NOT to execute the block when the strings do not match.

If [notest] is present in the message, we set another variable, RUN_TESTS to 0.

IF /I "%RUN_TESTS%" NEQ "0" (
    echo Running unit tests
    call :ExecuteCmd "%DEPLOYMENT_SOURCE%\tools\nunit\nunit-console.exe" "%DEPLOYMENT_SOURCE%\src\Tests\bin\Release\Tests.dll" /framework:v4.5.1
    IF !ERRORLEVEL! NEQ 0 goto error
) ELSE (
    echo Not running unit tests because [notest] was present in commit message
)

If RUN_TESTS does not evaluate to 0, then we run the tests! Otherwise we echo out an informative message as to why it was skipped.

Phew. So how much time do we save on [notest] builds now?

No test build

Compared to a build with tests:

Build with tests

So that flag cuts the build time in half! Nice! There are probably some other ways to improve the time. By the way, if you’re wondering what’s taking so long in your build, you can use the Kudu REST API to see your deployment logs (the /api/deployments endpoint), which contain full timestamp information!
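
For example, here’s a rough PowerShell sketch of pulling those logs; the site name is a placeholder and the credentials are your deployment credentials:

# Rough sketch: fetch timestamped deployment logs from Kudu
$cred = Get-Credential   # your deployment credentials
$base = 'https://yoursite.scm.azurewebsites.net'

# List deployments, then grab the log for the most recent one
$deployments = Invoke-RestMethod -Uri "$base/api/deployments" -Credential $cred
$latest = $deployments | Sort-Object received_time -Descending | Select-Object -First 1
Invoke-RestMethod -Uri "$base/api/deployments/$($latest.id)/log" -Credential $cred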

Happy continuous deployment!

written in Continuous Deployment, Continuous Integration, Git, Kudu, Testing, Windows Azure