Kamranicus

Personal and development blog of Kamran Ayub

about me

Hi, my name is Kamran. I am a web developer and designer residing in Minnesota. I’ve been programming since 2001 and I am familiar with many different languages, both web-based and server-based. I love to tinker around and think of new ways to solve problems.

OpenWeatherMap PowerShell Module

There’s no better way to learn something new than to make a thing. Yesterday I had a strong desire to know what the weather was like from my command prompt. I can’t explain it—I just had to, and I thought, “Hey, it can’t be that bad, right?” I decided to try writing and releasing an open-source PowerShell module.

written in OSS, PowerShell Read on →

Demystifying TypeScript at Midwest.js

Today I was excited to give a talk at Midwest.js about Demystifying TypeScript. This is not the first time I’ve given the talk; it’s actually the fourth. Instead of leaving it stale, I revamped the theme and updated all the information to be current as of today—including information about Babel.js and Flow.

Unfortunately due to technical difficulties (no HDMI-to-DisplayPort adapter!) the talk’s video wasn’t recorded. However, I was mic’d and I did use SnagIt to record my screen, so we’re hoping we can merge the audio and video tracks to generate a recording. If not, I may need to sit down and record audio for the talk.

I will update this post after the conference and let you know the status—let’s hope we can get it working! Otherwise, I guess you just had to be there ;)

If you attended and have questions, feel free to leave them here or tweet at me!

written in Conferences, JavaScript, Talks, TypeScript

ASP.NET Core Sample Demo Game

At General Mills we do semi-annual code camps where the developer organization gets together for a half-day of talks and fun. This past code camp, my partner in crime Erik Onarheim and I gave a talk around ASP.NET Core. It’s part of our roadmap to be familiar with hosting ASP.NET Core, so we wanted to build something and showcase to developers what’s changed in Core vs. a typical Framework application.

We made a trivial demo game built on top of SignalR and .NET Core while also showing off other new features of the stack, including the following (a short sketch of the options pattern follows the list):

  • Dependency Injection
  • Custom Tag Helpers
  • Cascading Configuration
  • Multi-Environment Support
  • Strongly-Typed Options
  • Injecting Services into Views
  • Injected MVC Filters
  • Bundling & Minification
  • Publishing to Azure
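
As a taste of a few of those items, here’s roughly what strongly-typed options plus dependency injection look like in ASP.NET Core. This is a sketch only: the GameOptions class and the “Game” section name are made up for illustration, not taken from the demo.

using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

// Hypothetical settings class bound from appsettings.json (illustrative only).
public class GameOptions
{
    public int MaxPlayers { get; set; } = 8;
}

public class Startup
{
    public Startup(IHostingEnvironment env)
    {
        // Cascading configuration: later sources override earlier ones,
        // so appsettings.Production.json can override appsettings.json.
        Configuration = new ConfigurationBuilder()
            .SetBasePath(env.ContentRootPath)
            .AddJsonFile("appsettings.json")
            .AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
            .Build();
    }

    public IConfigurationRoot Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        // Bind the "Game" section to GameOptions and register it in DI.
        services.Configure<GameOptions>(Configuration.GetSection("Game"));
        services.AddMvc();
    }
}

// Constructor injection then hands any controller a typed options object.
public class GameController : Controller
{
    private readonly GameOptions _options;

    public GameController(IOptions<GameOptions> options)
    {
        _options = options.Value;
    }
}

The same pattern carries over to injecting services into views and filters.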

We had the demo running on Azure during the talk so people could join the game and we even attempted showing Linux support, though web sockets were not behaving nicely behind nginx. The game is not really a game but more of a showcase of using web sockets to allow some real-time multiplayer server action. It is definitely not how you’d implement a “real” multiplayer game but it’s a fun demo.

You can check out the source code on GitHub: https://github.com/eonarheim/aspnet-core-demogame. It’s commented pretty heavily to help understand the different parts, and it’s based on the default out-of-the-box web template (dotnet new -t web).

We may or may not upload the talk onto YouTube since there wasn’t really anything specific to our work until the very end, which we can strip out without losing any important bits.

written in .NET Core, ASP.NET, C#, MVC, SignalR

Adding Subresource Integrity Support to Cassette .NET

If you aren’t familiar with Subresource Integrity, it’s a browser security measure that verifies embedded content like scripts and stylesheets against a hash of the file’s contents, helping defend against XSS attacks.

For example, let’s say you’re including a script from a CDN:

<script src="https://mycdn.com/jquery/1.0/jquery.js"></script>

Then let’s say the CDN is compromised, and instead of returning jQuery, the script returns some malicious code that could compromise your site. Even if you’re using Content Security Policy (CSP), you won’t be protected because you whitelisted the CDN.

Subresource Integrity allows you to put a hash of the file’s contents in an attribute of the tag. The browser will then hash the contents of the response from the CDN and compare it against the attribute provided. If the hashes don’t match, the browser won’t include the response and will throw an error.

<script src="https://mycdn.com/jquery/1.0/jquery.js" integrity="sha256-hfhsh02929fhgh303yg"></script>

Integrating with Cassette

I use Cassette to perform my bundling/minification and I also host my assets on a CDN. Even though they are my own assets, I still want to ensure they are served securely and take advantage of SRI.

For third-party scripts, it is fairly easy to take advantage of SRI by hashing the contents online and customizing the CDN reference in Cassette:

bundles.AddUrl("http://mycdn.com/jquery/1.0/jquery.js", bundle =>
    bundle.HtmlAttributes.Add("integrity", "sha256-jquerysfilehash"));

But since my own files are dynamic, how can we still leverage Cassette and automatically hash the file contents when outputting to the page?

Luckily, Cassette is pretty extensible and includes a way to customize the bundle pipeline. So what we can do is essentially override the rendering of the HTML and add the integrity attribute to the output.

To make this easy, I’ve created an open-source NuGet package called Cassette.SubresourceIntegrity. All you do is install the package, and that’s it. Since Cassette automatically scans for bundle customizations, all I did was implement a class, InsertIntoPipelineSubresourceIntegrity, that modifies the pipeline to replace a couple of parts with SRI-aware code.

The meat of the change is this code here:

string integrity;

// Hash the asset's contents and format the value the way SRI
// expects: "sha256-" plus the Base64-encoded digest.
using (var stream = asset.OpenStream())
{
    using (var sha256 = SHA256.Create())
    {
        integrity = $"integrity=\"sha256-{Convert.ToBase64String(sha256.ComputeHash(stream))}\"";
    }
}

// Render the script tag with the integrity attribute alongside any
// other HTML attributes configured on the bundle.
return $"<script src=\"{_urlGenerator.CreateAssetUrl(asset)}\" " +
       $"type=\"text/javascript\" " +
       $"{integrity}{bundle.HtmlAttributes.ToAttributeString()}></script>";

I am just getting the asset stream and hashing the contents using SHA256, then adding the attribute to the output. You’ll notice that the URLs are not changed, so Cassette will continue to use SHA1 hashes internally; it’s only when rendering that we use SHA256, because that’s the only place we need it.

While the code is interesting, it’s nothing too crazy—in fact, most of the code is only required because Cassette doesn’t expose certain classes needed in the rendering pipeline, so I basically had to copy/paste a lot of the helper classes.

The end result

Now Cassette will automatically include SRI hashes for individual assets:

<link href="cassette.axd/asset/Content/bootstrap/bootstrap-ext.css?cabc6264a89106c4b9021c293cfa5c2cae7a0549" 
    integrity="sha256-sNfA6O5zvZPmMJ474pm2w6UyZbz5tfukxTEZXrsLm7Q=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/modules/typeahead.css?00581b47ff3848da273d91c31adb8270e9ef8707" 
    integrity="sha256-W6JAiwRw2ER1QoXjXL/YxsY/On1Y7MhW4TtoWY0XuH8=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/modules/toastr.css?32e90a136e05728ac23995ff8fe33077df9f50ca" 
    integrity="sha256-JT6UwDlczdRDx+9mnMCzvxwABJP0cSDgNLmT+WumJrQ=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/hopscotch/hopscotch.css?58ea04e54df958c33cf9e6bfed9f39a166354e9c" 
    integrity="sha256-Bq06LI6L0XMhxF+CoJo+4L12w2Orsbh2oRjOZ+fpmWc=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/core.css?a3b4fcb8b7d9b0e8465a4fea29d60247ea47fd87" 
    integrity="sha256-fAqyFLkOx1cFONu7NXX3c7/G1DSmXeHgtPtcWU72a4E=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/library.css?2c2746a086737dc588e313c0cc2c5adf8b947605" 
    integrity="sha256-SaP9kdYfbafIVes+qntAiDLPsi4JaXnit4eN6IfU9lA=" type="text/css" rel="stylesheet"/>

and bundles:

<link href="cassette.axd/stylesheet/ba58f2a04873e41b6a599274ea6768db1a61a650/Content/core" 
    integrity="sha256-thzkrIApz9dAI9nfJGleO1jbNFXXVT/BxoSynI2pEPw=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/stylesheet/2c2746a086737dc588e313c0cc2c5adf8b947605/Content/library.css" 
    integrity="sha256-6LgYbxu4UwouRBqvUdHZAQc0lewdik6aZYpDgrtAWJ4=" type="text/css" rel="stylesheet"/>

Voila!

written in C#, NuGet, OSS, Security

Designing Hexshaper, a Game for the Ludum Dare 35 Game Jam

This last weekend I took part in the global Ludum Dare 35 game jam. If you’ve been following me for a while, you know I’ve participated in the past, too. This time we made a game called Hexshaper—where the goal is to fly around, absorbing magic to seal portals to another dimension and prevent monsters from overtaking the world. The backstory, while not communicated directly, informed our design—but it wasn’t like that at first.

written in Excalibur.js, Game Jams, Games, JavaScript, Ludum Dare, TypeScript Read on →

TypeScript in Action at Twin Cities Code Camp 20 (TCCC20)

Update (4/20/16): The presentation is now up on YouTube (and the slides).


I’ll be speaking at 8:30am this weekend at Twin Cities Code Camp. My talk will be about using TypeScript in the context of an Angular 2 ASP.NET MVC web application, but it will focus on how TypeScript enhances my productivity and will showcase some features of the language. I’ve done the talk previously, internally at General Mills, and had a great response, so I thought I’d piggyback on the success of my previous Demystifying TypeScript talk (which is also on YouTube).

If you’re at all interested in seeing how TypeScript development looks in action, you should attend and I’d recommend going through my previous talk if you’re still not convinced TypeScript is awesome. This talk assumes you are at least open to the idea of developing in TypeScript and are curious to see how it can be used.

It’ll be a fun time and a busy weekend—I’ll have to leave the conference right after my talk to participate in Ludum Dare, where I’ll be helping to build a game in 72 hours. I’m sad I’ll miss the speaker happy hour and the prizes but it’s for a good reason! Hope to see you Saturday!

written in Conferences, JavaScript, Talks, TypeScript

Generating an Encryption Certificate for PowerShell DSC in WMF5

I’m currently building out a PowerShell DSC pull server cluster at work. If you aren’t familiar with DSC, I’ll talk more about it in an upcoming post that ties it all together. The long and short of it is that DSC is a way to store configuration as code and automate the configuration of many servers at once.

In the recent Windows Management Framework 5 release, Microsoft has improved its support and feature set for DSC, but with a new release come new surprises. The first surprise you may run into, as we did, is that your old WMF4 way of encrypting MOF files doesn’t work. In WMF5, the requirements for the certificate used to secure MOF files are stricter. Taken from MSDN:

  1. Key Usage:
     • Must contain: ‘KeyEncipherment’ and ‘DataEncipherment’.
     • Should not contain: ‘Digital Signature’.
  2. Enhanced Key Usage:
     • Must contain: Document Encryption (1.3.6.1.4.1.311.80.1).
     • Should not contain: Client Authentication (1.3.6.1.5.5.7.3.2) and Server Authentication (1.3.6.1.5.5.7.3.1).

If you read my previous foray into certificates with Azure Key Vault, you know I’m pretty green when it comes to certificate management and terminology. I really didn’t know what this stuff meant—I mean, I understand a certificate has key usages and enhanced key usages, but how does it get them? It has to do with the certificate request and the template used to provision your certificate.

It turns out Microsoft recommends obtaining a certificate from Active Directory Certificate Services. That’s cool, but I’m just a developer who wants to work on DSC; I don’t have an ADCS server to give me certificates during testing. That’s a different team altogether, and when their primary guy is out of the office, I’m a bit stuck.

Update (4/13): TechNet now has a guide on how to generate certificates for WMF5. I’m leaving the rest of this post as-is for posterity.


I thought I could maybe use a self-signed certificate while I wait for a “for real” one later. After searching around for a method to create a certificate with the required KU and EKU specs, I found a lot of answers suggesting OpenSSL. I’ve never used OpenSSL before, so I thought I’d give it a try, and I found it a bit confusing—I think I could have gotten it to work, but instead I came across a random PowerShell article (unrelated to anything) using a utility called certreq that could handle providing custom key usages. Problem solved!

You just need to create a file to define your certificate settings, MyCert.inf:

[Version]
Signature = "$Windows NT$"

[Strings]
szOID_ENHANCED_KEY_USAGE = "2.5.29.37"
szOID_DOCUMENT_ENCRYPTION = "1.3.6.1.4.1.311.80.1"

[NewRequest]
Subject = "cn=me@example.com"
MachineKeySet = false
KeyLength = 2048
KeySpec = AT_KEYEXCHANGE
HashAlgorithm = Sha1
Exportable = true
RequestType = Cert

KeyUsage = "CERT_KEY_ENCIPHERMENT_KEY_USAGE | CERT_DATA_ENCIPHERMENT_KEY_USAGE"
ValidityPeriod = "Years"
ValidityPeriodUnits = "1000"

[Extensions]
%szOID_ENHANCED_KEY_USAGE% = "{text}%szOID_DOCUMENT_ENCRYPTION%"

Just change the Subject line to whatever you need in your case.

Then execute certreq using the input file:

certreq -new MyCert.inf MyCert.cer

Certreq should be available if you have Makecert—if you aren’t finding it in the default command prompt, try using the Visual Studio Command Prompt. Once you execute the command it will generate a public key file and install the private/public key pair into your CurrentUser personal certificate store:

PS> dir Cert:\CurrentUser\My
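
If you’d rather verify from code that the certificate landed in the store with the required Document Encryption EKU, here’s a quick C# sketch (purely illustrative; the dir command above is all you really need):

using System;
using System.Security.Cryptography.X509Certificates;

class FindDscCert
{
    static void Main()
    {
        // Open the same store that "dir Cert:\CurrentUser\My" lists.
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);

        foreach (var cert in store.Certificates)
        {
            foreach (var ext in cert.Extensions)
            {
                var eku = ext as X509EnhancedKeyUsageExtension;
                if (eku == null) continue;

                foreach (var oid in eku.EnhancedKeyUsages)
                {
                    // Document Encryption OID required by WMF5 DSC
                    if (oid.Value == "1.3.6.1.4.1.311.80.1")
                        Console.WriteLine($"{cert.Subject}: {cert.Thumbprint}");
                }
            }
        }

        store.Close();
    }
}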

From there, you can export the private/public keys and install it on your DSC nodes.


Until you get a signed certificate from your CA, this should work. Hope that helps!

written in DSC, PowerShell, Security

See You at Build 2016

I love attending conferences. At the end of the month I’ll be attending Build 2016, Microsoft’s annual developer conference. At the end of January, some friends and I attended PAX South in San Antonio (not for work) and it was a blast. A gaming convention has a different air than a developer conference. There’s a certain energy, and it’s very motivating being around fellow developers, learning new tech, and seeing what’s new. I’ve attended Build several times in the past and it’s always been fun—especially when you meet coding “celebs”. Usually I take my wife along even though she doesn’t attend the conference, and we’ll typically stay longer. This year, not only will she be joining me, but our friend will be coming as well; both of them will be traipsing around San Francisco while I attend the conference with a few coworkers. My partners in crime, Erik & Alan, will be joining me, so it’ll be a ton of fun (yes, we all work together, and yes, we just went to PAX together). Going to a conference is fun; going to a conference with your friends is even more fun.

After Build we’ll leave my coworkers behind and fly up to Portland to explore the area. I’ve been to downtown Seattle for PAX Prime before, but I’ve only been in the surrounding area once, when I was a teenager, and I always thought it was beautiful. My wife has never been to the Pacific Northwest, so we thought it’d be fun to spend a few extra days and drive around. We’re all looking forward to it.

If you’re at Build and you follow me, be sure to send me a tweet and we’ll meet up!

written in Azure, Conferences, Microsoft

Handling Multiple Origins in CORS Using URL Rewrite

Here’s a quick tip if you’re trying to figure out how to handle cross-origin resource sharing (CORS) when you have multiple origins (namely, HTTP and HTTPS). This works in IIS 8.0 and above, including Azure, as long as you have the URL Rewrite module installed.

The CORS header looks like this:

Access-Control-Allow-Origin: http://mydomain.com

The spec is very strict. The header can only contain a single value and it must be fully qualified, which means if you have a site that is served over HTTP and HTTPS (or multiple domains), you need to dynamically build this header in your response. Many tutorials and blog posts say to specify * as the value—DO NOT DO THIS! That means any origin (domain) can embed/request assets from your website. Unless you have hundreds of sites doing this (i.e., you’re a CDN), you should only whitelist the domains that can include resources from your site.

If you are sharing resources with a known number of hosts, the following method will help. If it’s a dynamic list, you will need to programmatically add the Access-Control-Allow-Origin header depending on the incoming Origin header—something I won’t cover here.
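
For reference, that programmatic approach can be as small as a Global.asax handler that echoes back a whitelisted Origin. This is a rough sketch, not what I run in production, and the whitelist is illustrative:

using System;
using System.Web;

public class Global : HttpApplication
{
    // Illustrative whitelist; replace with your own origins.
    private static readonly string[] AllowedOrigins =
    {
        "http://mydomain.com",
        "https://mydomain.com"
    };

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        var origin = Request.Headers["Origin"];

        // Echo the Origin back only when it's on the whitelist; the header
        // allows a single value, so we never emit more than one origin.
        if (origin != null && Array.IndexOf(AllowedOrigins, origin) >= 0)
        {
            Response.AppendHeader("Access-Control-Allow-Origin", origin);
        }
    }
}

Because the header is echoed rather than hard-coded, each whitelisted origin gets back exactly one value, which is what the spec demands.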

Rather than messing with C# and modifying outgoing responses, what I ended up using was a simple URL Rewrite rule, proposed by this Stack Overflow answer. All it does is add a header to the outbound response when a regular expression matches—in this case, whitelisting only the HTTP and HTTPS versions of my domain (or subdomain).

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Headers" value="Origin, X-Requested-With, Content-Type, Accept" />
      <add name="Access-Control-Allow-Methods" value="POST,GET,OPTIONS,PUT,DELETE" />
    </customHeaders>
  </httpProtocol>
  <rewrite>
    <outboundRules>
      <clear />
      <rule name="AddCrossDomainHeader">
        <match serverVariable="RESPONSE_Access_Control_Allow_Origin" pattern=".*" />
        <conditions logicalGrouping="MatchAll" trackAllCaptures="true">
          <add input="{HTTP_ORIGIN}" pattern="(http(s)?:\/\/((.+\.)?mydomain\.com))" />
        </conditions>
        <action type="Rewrite" value="{C:0}" />
      </rule>
    </outboundRules>
  </rewrite>
</system.webServer>

This uses the URL Rewrite module’s special syntax (RESPONSE_) to add an outgoing response header (dashes replaced with underscores). It matches the incoming Origin header, compares the value, and, if it matches, includes the CORS header with the value of my domain.

That was all I had to do!

Note: Since I just converted over to always-on SSL, I no longer need this workaround, but multiple origins are pretty common when dealing with CORS, so this solution will come in handy.

written in Azure, C#, Security