Kamranicus

Personal and development blog of Kamran Ayub

about me

Hi, my name is Kamran. I am a web developer and designer, residing in Minnesota. I’ve been programming since 2001 and I am familiar with many different languages, both web-based and server-based. I love to tinker around and think of new ways to solve problems.

Working With Packt Publishing to Create a TypeScript Video Course

Packt

I have some great news to share! Last week I signed on the dotted line to publish a TypeScript course for Packt Publishing. I’m very excited for this opportunity—as you know, I’m a big fan of TypeScript and I talk about it a lot. This is the first video course I’ll have worked on, though I’m no stranger to screencasts. I’m pretty pumped and I think I have some good material to share.

I hope to share more about the recording process as I dive into it—the first order of business is rigging up and testing all the recording equipment that has been sitting in my basement collecting dust. I already have a recording mic, shock mount, and preamp, for example. They are years old now, so I may need to get some new equipment, but I’m crossing my fingers that most of it will still produce a good sound. The Sound Blaster X-Fi Elite Pro I bought years ago is only half-working—it seems to output sound for a bit and then crap out, but I’m hoping the problem is limited to output and that it can still record input. The X-Fi external box supports some of the standard audio plugs I’d prefer using. I may end up having to go entirely USB; we’ll see.

The course should be ready by mid-February and will be geared towards introducing people to TypeScript. I plan to use practical code to introduce concepts, and the final volume of the course should have 2-3 small projects we’ll build from scratch.

written in Courses, Publishing, Recording, TypeScript

Thoughts on Upgrading to a Dedicated Raven Standard Instance

As you may know, I use the NoSQL RavenDB database to power my hobby project Keep Track of My Games. I really like the development workflow and the way it has simplified a lot of my data access logic. Right now I use the hosted RavenHQ solution for my two databases (staging and production). I’ve been doing that for some time now for a few reasons:

  • The regular yearly Standard RavenDB subscription of $700 is a bit much for a hobby project
  • I realized too late that you could pay quarterly or one-time. One-time licenses only grant upgrades for 18 months, and since major version upgrades seem to happen beyond that window, it’s hard to justify even the one-time payment long term.
  • The Basic/Basic 2X pricing would have been perfect, except the 2GB database size limit was a dealbreaker. Additionally, I’d still need to add that monthly cost to the monthly cost of a VM.
  • When I started with RavenHQ, the price was low enough to justify the latency. Now, though, the $10/1GB overage is getting to me, especially since the two replicated instances EACH incur the extra $10/1GB.

The Cloud is Expensive and Slow [for Me]

This last week, though, I decided to see if I could lower my database hosting costs, and the number one way to do that was to get a Standard license and manage my own VM.

The hard truth is that PaaS offers awesome managed services and convenience, but you pay (through the nose) for that convenience. Performance often comes at a high cost, so you’re stuck with the smallest possible pricing tier; otherwise your costs easily double and triple. When you want to save money, you need to go IaaS, but even the major cloud platforms like Azure, AWS, and Google aren’t super affordable for a hobby project when you want performance. Anything above 1 core on most cloud platforms costs a minimum of about $100. You can’t really expect to run Raven on a single-core, 1GB machine—you need some metal.

I Can Do It Myself

Enter Vultr.com. Have a look at their pricing for a second and then come back…

Back? I KNOW, RIGHT?!

I discovered Vultr from reading about Rick Strahl’s woes with Azure VM performance (see a pattern?); he switched to Vultr VMs and was a happy camper. I have a 2 Core, 2GB VM spun up right now, and even though I’m not hosting my site/DB on it yet, it already feels way faster than an Azure VM.

If I could find a way to get a Standard Raven license, I could move all my stuff over to much cheaper infrastructure and get 2-3x the performance for a lower price. Yes, I may sacrifice some high availability and take on more management overhead, but managing web sites is what I do and I know how to monitor them. Besides, even if I got a $36 2 Core 2GB web instance AND a $56 4 Core 4GB DB instance, it would still be cheaper than my current hosting, which has way less performance. Not to mention I could migrate to a .NET Core-based site and host on an even cheaper Linux VM. Raven 4 will also be able to run on Linux. That is some nice cost cutting!

The Magic Non-Commercial License

So I’d have to pony up $1000 for a Standard license if I wanted to pay up front and reduce my long-term hosting cost. It’s hard enough to justify that to myself, let alone to my wife! Yes, I’ve paid more than that overall so far, but that’s a sunk cost—it’s tough to pay a lump sum like that all at once. Still, I was willing to swallow that pill after a month or two of saving up.

But I wondered if there was some other way. I kept thinking that since I’m not making money from KTOMG, I wished it could count the way an OSS project does.

So I went back to the RavenDB pricing page. At the bottom it says “Open Source? Apply”. This is great—I love it when awesome products like ReSharper, RavenDB, and others are free for OSS projects. The issue is that KTOMG is not an OSS project. It’s non-commercial, but it’s not OSS. So, naturally, I ignored that offer years ago and never applied for an OSS license.

I decided to Google more about RavenDB licensing—OSS projects can use it under AGPL but everyone else needs a commercial license, right? Well, yes… but:

NC: Blah I hate OSS, I don’t understand all this stuff. I mean if i create a community site thats not designed to make money other than advertising to cover server costs. I don’t want to make my site open source to use OSS, does that mean I have to pay for a commercial license?

Ayende: Yes. Or, for your scenario, you could contact us and ask for a freebie license in return for something like a “Powered by RavenDB” logo.

This was in the comments on Ayende’s blog post on RavenDB licensing.

Whaaaaat?!

It turns out that if you ask nicely and have a truly non-commercial project, you may qualify for a “freebie” license in exchange for some free press or a watermark (I filled out a testimonial and shared my thoughts with the team). I wish I had known that 3 years ago! I would have done it from the beginning—and I bet there are more people out there like me who didn’t know it was a possibility. How could they? The pricing page doesn’t mention it at all. It could be that they don’t want to advertise it because more people would ask—but I think that would be a good thing.

It’s true that RavenHQ was able to extract money out of me for a while, and that benefited them—but it wasn’t going to sustain itself. My options were to a) fork out for a Standard license and pay once, or b) explore other options like Marten for PostgreSQL. There may be good reasons to move to Postgres, but I am comfortable with Raven and what it gives me, so I’m not exactly looking to migrate all my data access code (again) to another provider. That isn’t value-added work to me; I want to focus on releasing features!

I think the more folks use Raven, the more word will spread. If people are not using Raven because they think they have to pay $700/yr out of pocket for a non-commercial project, they could be attracted back knowing they can still harness the power of Raven with a free non-commercial license. Perhaps not everyone will want to manage their own VM, but people like me working on fun projects who want a robust NoSQL database for .NET would jump at the chance.

If anyone at Hibernating Rhinos is reading this, consider adding the non-commercial license as an “official” available license alongside OSS. It might be good for business!

written in Databases, Keep Track of My Games, RavenDB

OpenWeatherMap PowerShell Module

There’s no better way to learn something new than to make a thing. Yesterday I had a strong desire to know what the weather was like from my command prompt. I can’t explain it—I just had to, and I thought, “Hey, it can’t be that bad, right?” So I decided to try writing and releasing an open-source PowerShell module.

written in OSS, PowerShell Read on →

Demystifying TypeScript at Midwest.js

Today I was excited to give a talk at Midwest.js about Demystifying TypeScript. This is not the first time I’ve given the talk; it’s actually the fourth. Instead of leaving it stale, I revamped the theme and updated all the information to be current as of today—including information about Babel.js and Flow.

Unfortunately, due to technical difficulties (no HDMI-to-DisplayPort adapter!), the talk’s video wasn’t recorded. However, I was mic’d and I did use SnagIt to record my screen, so we’re hoping we can merge the audio and video tracks to generate a recording. If not, I may need to sit down and record audio for the talk.

I will update this post after the conference and let you know the status—let’s hope we can get it working! Otherwise, I guess you just had to be there ;)

If you attended and have questions, feel free to leave them here or tweet at me!

written in Conferences, Javascript, Talks, TypeScript

ASP.NET Core Sample Demo Game

At General Mills we do semi-annual code camps where the developer organization gets together for a half-day of talks and fun. This past code camp, my partner in crime Erik Onarheim and I gave a talk on ASP.NET Core. It’s part of our roadmap to become familiar with hosting ASP.NET Core, so we wanted to build something and show developers what’s changed in Core vs. a typical Framework application.

We made a trivial demo game built on top of SignalR and .NET Core while also showing off other new features of the stack, including:

  • Dependency Injection
  • Custom Tag Helpers
  • Cascading Configuration
  • Multi-Environment Support
  • Strongly-Typed Options
  • Injecting Services into Views
  • Injected MVC Filters
  • Bundling & Minification
  • Publishing to Azure
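
As a taste of the “Strongly-Typed Options” item above, here is a minimal sketch using the current ASP.NET Core configuration and options APIs; GameOptions, the “Game” section name, and LobbyController are illustrative and not part of the demo:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

// Hypothetical POCO bound from a "Game" section of appsettings.json.
public class GameOptions
{
    public int MaxPlayers { get; set; }
    public int TickRateMs { get; set; }
}

public class Startup
{
    public Startup(IConfiguration configuration) => Configuration = configuration;
    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        // Bind the configuration section once; consumers ask for IOptions<GameOptions>.
        services.Configure<GameOptions>(Configuration.GetSection("Game"));
        services.AddMvc();
    }
}

// Any controller (or hub, view, or filter) can take the typed options via DI.
public class LobbyController : Controller
{
    private readonly GameOptions _options;
    public LobbyController(IOptions<GameOptions> options) => _options = options.Value;

    public IActionResult Index() => Content($"Max players: {_options.MaxPlayers}");
}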

We had the demo running on Azure during the talk so people could join the game, and we even attempted to show Linux support, though web sockets were not behaving nicely behind nginx. The game is not really a game but more of a showcase of using web sockets to enable some real-time multiplayer server action. It is definitely not how you’d implement a “real” multiplayer game, but it’s a fun demo.
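
To give a sense of the moving parts, here is a minimal sketch of the kind of hub such a demo relies on, written against the current ASP.NET Core SignalR API. The hub name, method, and client event are illustrative, not taken from the actual demo code:

using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Hypothetical hub: relays a player's position to everyone in the game.
public class GameHub : Hub
{
    public Task MovePlayer(string playerId, int x, int y)
    {
        // Broadcast to all connected clients, including the sender.
        return Clients.All.SendAsync("playerMoved", playerId, x, y);
    }
}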

You can check out the source code on GitHub: https://github.com/eonarheim/aspnet-core-demogame. It’s commented pretty heavily to help explain the different parts, and it’s based on the default out-of-the-box web template (dotnet new -t web).

We may or may not upload the talk onto YouTube since there wasn’t really anything specific about our work until the very end, which we can strip out without losing any important bits.

written in .NET Core, ASP.NET, C#, MVC, SignalR

Adding Subresource Integrity Support to Cassette .NET

If you aren’t familiar with Subresource Integrity, it’s a browser security feature that protects embedded content like scripts and stylesheets with a hash of the file’s contents, helping to defend against XSS attacks.

For example, let’s say you’re including a script from a CDN:

<script src="https://mycdn.com/jquery/1.0/jquery.js"></script>

Then let’s say the CDN is compromised and, instead of returning jQuery, the URL returns some malicious code that could compromise your site. Even if you’re using Content Security Policy (CSP), you won’t be protected, because you whitelisted the CDN.

Subresource Integrity allows you to put a hash of the file’s contents in an attribute of the tag. The browser will then hash the contents of the response from the CDN and compare it against the attribute provided. If the hashes don’t match, the browser won’t include the response and will throw an error.

<script src="https://mycdn.com/jquery/1.0/jquery.js" integrity="sha256-hfhsh02929fhgh303yg"></script>
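
The value after sha256- is just the Base64-encoded SHA-256 digest of the file’s bytes. If you’d rather compute it yourself than paste the file into an online hasher, a few lines of C# will do it (the local jquery.js path is only an example):

using System;
using System.IO;
using System.Security.Cryptography;

class Program
{
    static void Main()
    {
        // Hash the exact bytes the CDN will serve and Base64-encode the digest.
        using (var stream = File.OpenRead("jquery.js"))
        using (var sha256 = SHA256.Create())
        {
            var digest = Convert.ToBase64String(sha256.ComputeHash(stream));
            Console.WriteLine($"sha256-{digest}");
        }
    }
}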

Integrating with Cassette

I use Cassette to perform my bundling/minification and I also host my assets on a CDN. Even though they are my own assets, I still want to ensure they are served securely and take advantage of SRI.

For third-party scripts, it is fairly easy to take advantage of SRI by hashing the contents online and customizing the CDN reference in Cassette:

bundles.AddUrl("http://mycdn.com/jquery/1.0/jquery.js", bundle =>
    bundle.HtmlAttributes.Add("integrity", "sha256-jquerysfilehash"));

But since my own files are dynamic, how can we still leverage Cassette and automatically hash the file contents when outputting to the page?

Luckily, Cassette is pretty extensible and includes a way to customize the bundle pipeline. So what we can do is essentially override the rendering of the HTML and add the integrity attribute to the output.

To make this easy, I’ve created an open-source NuGet package called Cassette.SubresourceIntegrity. All you do is install the package, and that’s it. Since Cassette automatically scans for bundle customizations, all I did was implement a class, InsertIntoPipelineSubresourceIntegrity, that modifies the pipeline to replace a couple of parts with SRI-aware code.

The meat of the change is this code here:

// Hash the asset's contents (SHA256 comes from System.Security.Cryptography).
string integrity;

using (var stream = asset.OpenStream())
{
    using (var sha256 = SHA256.Create())
    {
        // Base64-encode the SHA-256 digest in the format the integrity attribute expects.
        integrity = $"integrity=\"sha256-{Convert.ToBase64String(sha256.ComputeHash(stream))}\"";
    }
}

// Render the script tag with the integrity attribute plus any custom bundle attributes.
return $"<script src=\"{_urlGenerator.CreateAssetUrl(asset)}\" " +
       $"type=\"text/javascript\" " +
       $"{integrity}{bundle.HtmlAttributes.ToAttributeString()}></script>";

I am just getting the asset stream and hashing the contents using SHA256, then adding the attribute to the output. You’ll notice that the URLs are not changed, so Cassette will continue to use SHA1 hashes internally. It’s only when rendering that we use SHA256, because that’s the only place we need it.

While the code is interesting, it’s nothing too crazy—in fact, most of the code is required because Cassette doesn’t expose certain classes used in the rendering pipeline, so I had to basically copy/paste a lot of the helper classes.

The end result

Now Cassette will automatically include SRI hashes for individual assets:

<link href="cassette.axd/asset/Content/bootstrap/bootstrap-ext.css?cabc6264a89106c4b9021c293cfa5c2cae7a0549" 
    integrity="sha256-sNfA6O5zvZPmMJ474pm2w6UyZbz5tfukxTEZXrsLm7Q=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/modules/typeahead.css?00581b47ff3848da273d91c31adb8270e9ef8707" 
    integrity="sha256-W6JAiwRw2ER1QoXjXL/YxsY/On1Y7MhW4TtoWY0XuH8=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/modules/toastr.css?32e90a136e05728ac23995ff8fe33077df9f50ca" 
    integrity="sha256-JT6UwDlczdRDx+9mnMCzvxwABJP0cSDgNLmT+WumJrQ=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/hopscotch/hopscotch.css?58ea04e54df958c33cf9e6bfed9f39a166354e9c" 
    integrity="sha256-Bq06LI6L0XMhxF+CoJo+4L12w2Orsbh2oRjOZ+fpmWc=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/core.css?a3b4fcb8b7d9b0e8465a4fea29d60247ea47fd87" 
    integrity="sha256-fAqyFLkOx1cFONu7NXX3c7/G1DSmXeHgtPtcWU72a4E=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/asset/Content/library.css?2c2746a086737dc588e313c0cc2c5adf8b947605" 
    integrity="sha256-SaP9kdYfbafIVes+qntAiDLPsi4JaXnit4eN6IfU9lA=" type="text/css" rel="stylesheet"/>

and bundles:

<link href="cassette.axd/stylesheet/ba58f2a04873e41b6a599274ea6768db1a61a650/Content/core" 
    integrity="sha256-thzkrIApz9dAI9nfJGleO1jbNFXXVT/BxoSynI2pEPw=" type="text/css" rel="stylesheet"/>
<link href="cassette.axd/stylesheet/2c2746a086737dc588e313c0cc2c5adf8b947605/Content/library.css" 
    integrity="sha256-6LgYbxu4UwouRBqvUdHZAQc0lewdik6aZYpDgrtAWJ4=" type="text/css" rel="stylesheet"/>

Voila!

written in C#, Nuget, OSS, Security

Designing Hexshaper, a Game for the Ludum Dare 35 Game Jam

This last weekend I took part in the global Ludum Dare 35 game jam. If you’ve been following me for a while, you know I’ve participated in the past, too. This time we made a game called Hexshaper—where the goal is to fly around, absorbing magic to seal portals to another dimension to prevent monsters from overtaking the world. The backstory, while not communicated directly, informed our design—but it wasn’t like that at first.

written in Excalibur.js, Game Jams, Games, Javascript, Ludum Dare, TypeScript Read on →

TypeScript in Action at Twin Cities Code Camp 20 (TCCC20)

Update (4/20/16): The presentation is now up on YouTube (and the slides).


I’ll be speaking at 8:30am this weekend at Twin Cities Code Camp. My talk will be about using TypeScript in the context of an Angular 2 ASP.NET MVC web application, but it will focus on how TypeScript enhances my productivity and will showcase some features of the language. I’ve done the talk previously, internally at General Mills, and had a great response, so I thought I’d piggyback on the success of my previous Demystifying TypeScript talk (which is also on YouTube).

If you’re at all interested in seeing how TypeScript development looks in action, you should attend and I’d recommend going through my previous talk if you’re still not convinced TypeScript is awesome. This talk assumes you are at least open to the idea of developing in TypeScript and are curious to see how it can be used.

It’ll be a fun time and a busy weekend—I’ll have to leave the conference right after my talk to participate in Ludum Dare, where I’ll be helping to build a game in 72 hours. I’m sad I’ll miss the speaker happy hour and the prizes but it’s for a good reason! Hope to see you Saturday!

written in Conferences, JavaScript, Talks, TypeScript

Generating an Encryption Certificate for PowerShell DSC in WMF5

I’m currently building out a PowerShell DSC pull server cluster at work. If you aren’t familiar with DSC, I’ll talk more about it in an upcoming post that ties it all together. The long and short of it is that DSC is a way to store configuration as code and automate the configuration of many servers at once.

In the recent Windows Management Framework 5 release, Microsoft has improved its support and feature set for DSC, but with a new release come new surprises. The first surprise you may run into, as we did, is that your old WMF4 way of encrypting MOF files doesn’t work. In WMF5, the requirements for the certificate used to secure MOF files are stricter. Taken from MSDN:

  • Key Usage:
      • Must contain: ‘KeyEncipherment’ and ‘DataEncipherment’.
      • Should not contain: ‘Digital Signature’.
  • Enhanced Key Usage:
      • Must contain: Document Encryption (1.3.6.1.4.1.311.80.1).
      • Should not contain: Client Authentication (1.3.6.1.5.5.7.3.2) and Server Authentication (1.3.6.1.5.5.7.3.1).

If you read my previous foray into certificates with Azure Key Vault, you know I’m pretty green when it comes to certificate management and terminology. I really didn’t know what this stuff meant—I mean, I understand a certificate has key usages and enhanced key usages, but how does it get them? It has to do with the certificate request and the template used to provision your certificate.

It turns out Microsoft recommends obtaining a certificate from Active Directory Certificate Services. That’s cool, but I’m just a developer who wants to work on DSC; I don’t have an ADCS server to give me certificates during testing. That’s a different team altogether, and when their primary guy is out of the office, I’m a bit stuck.

Update (4/13): TechNet now has a guide on how to generate certificates for WMF5. I’m leaving the rest of this post as-is for posterity.


I thought I could maybe use a self-signed certificate while I wait for a “for real” one later. After searching around for a method to create a certificate with the required KU and EKU specs, I found a lot of answers suggesting OpenSSL. I’ve never used OpenSSL before, so I gave it a try and found it a bit confusing—I think I could have gotten it to work, but instead I came across a random PowerShell article (unrelated to anything) that used a utility called certreq, which can handle custom key usages. Problem solved!

You just need to create a file to define your certificate settings, MyCert.inf:

[Version]
Signature = "$Windows NT$"

[Strings]
szOID_ENHANCED_KEY_USAGE = "2.5.29.37"
szOID_DOCUMENT_ENCRYPTION = "1.3.6.1.4.1.311.80.1"

[NewRequest]
Subject = "cn=me@example.com"
MachineKeySet = false
KeyLength = 2048
KeySpec = AT_KEYEXCHANGE
HashAlgorithm = Sha1
Exportable = true
RequestType = Cert

KeyUsage = "CERT_KEY_ENCIPHERMENT_KEY_USAGE | CERT_DATA_ENCIPHERMENT_KEY_USAGE"
ValidityPeriod = "Years"
ValidityPeriodUnits = "1000"

[Extensions]
%szOID_ENHANCED_KEY_USAGE% = "{text}%szOID_DOCUMENT_ENCRYPTION%"

Just change the Subject line to whatever you need in your case.

Then execute certreq using the input file:

certreq -new MyCert.inf MyCert.cer

Certreq should already be available on most Windows machines; if you aren’t finding it in the default command prompt, try using the Visual Studio Command Prompt. Once you execute the command, it will generate a public key file and install the private/public key pair into your CurrentUser personal certificate store:

PS> dir Cert:\CurrentUser\My

From there, you can export the private/public keys and install them on your DSC nodes.

Example screenshot
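
If it helps, here is roughly what that export could look like in PowerShell, assuming the PKI module’s Export-PfxCertificate and Export-Certificate cmdlets are available; adjust the subject filter and file names to match your own certificate:

# Find the cert created by certreq (match the Subject you used in MyCert.inf).
$cert = Get-ChildItem Cert:\CurrentUser\My |
    Where-Object { $_.Subject -eq 'CN=me@example.com' }

# Export the private/public key pair to import on your DSC nodes...
$password = Read-Host -AsSecureString -Prompt 'PFX password'
Export-PfxCertificate -Cert $cert -FilePath .\DscEncryption.pfx -Password $password

# ...and the public key alone, which the authoring machine uses to encrypt the MOFs.
Export-Certificate -Cert $cert -FilePath .\DscEncryption.cer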

Until you get a signed certificate from your CA, this should work. Hope that helps!

written in DSC, PowerShell, Security