Friday, April 29, 2016

Introduction to Caching in JavaScript and NodeJS

Disclaimer:  This article is intended for new web developers who don't know much about caching and would like to have a basic idea.  It is not a comprehensive post, nor is it supposed to accurately reflect advanced caching systems.  All of the scripts referenced in this post can be downloaded from my GitHub Repository.

When I was first starting out in web development, I heard people talk a lot about this "cash" system that you could use in websites and your code, but I honestly didn't know much about it.  I eventually figured out it was a way to store data so you could access it faster, but I still thought it was some complicated architecture or a special system.  Really, I was just overthinking it.

A cache is simply a place in memory where you can store data that is accessed often.  In the context of JavaScript, this can be as simple as a global variable that you put the results of an AJAX call into.  Then you write your function to look there first: if something is there, return it; otherwise, make the original call.  There are a lot of libraries out there that will do this for you, but we're going to build a very basic caching system and walk through each of the parts.

I'm going to use NodeJS for this blog post, but this works just as well in the browser with an AJAX request or something similar.

First, I'll create a function that calls out to the GitHub API and returns a value.  I'll walk through the function and then add some caching to show how it's done.  Below is the basic script, which you can run with:

node nocache.js
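A minimal sketch of that script looks something like this (the exact version is in the GitHub repo; I'm using my own GitHub username for the User-Agent):

var request = require('request');

// GitHub's API requires a User-Agent header or it rejects the request
var options = {
  url: 'https://api.github.com/users/WakeskaterX/repos',
  headers: { 'User-Agent': 'WakeskaterX' }
};

function getRepos(callback) {
  var start = +Date.now();
  console.log('Start:', start);

  request(options, function (error, response, body) {
    if (error) return callback(error);

    // The body comes back as a JSON string, so parse it first
    var repos = JSON.parse(body);
    console.log('First repo:', repos[0].name);

    var end = +Date.now();
    console.log('End:', end);
    console.log('Elapsed:', end - start, 'ms');
    callback(null, repos);
  });
}

getRepos(function () {});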



For this tutorial, I am using the npm library request for ease of use, and I'm making a request to the GitHub API to query my repositories for data about them.  This is very similar to a request you would use in Express or another web framework, and I've done something very similar for other projects.  One thing to note is that the request library lets you easily specify the options for the HTTP request, and the GitHub API requires a User-Agent in the header, which is why that's there.  If you don't include it, GitHub will return an error and reject your request.

Next, I created the function to make the request.  For this tutorial I have a bunch of logs, plus start and end times, to track the time in milliseconds the entire request takes.  So I set the start to +Date.now(), which just makes sure we have a plain number (in ms), and logged the start time.  The data in the body comes as a JSON string, so I parsed that and logged the name of the first item, since it's an array of information.

Finally, there are some logs to output the ending time and the total time elapsed for the request.  On average I get roughly 400 ms on a good connection.  And if this is inside a web request, you don't want to be adding half a second to a second to every request, on top of everything else the server is doing.  To simulate this, the script nocache_multi.js has a few setTimeouts to repeatedly call the same function.  When you run it, each call gets a similar response time.
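The repeated calls in nocache_multi.js look roughly like this:

// Call the same function a few times, spaced 1 second apart
getRepos(function () {});
setTimeout(function () { getRepos(function () {}); }, 1000);
setTimeout(function () { getRepos(function () {}); }, 2000);
setTimeout(function () { getRepos(function () {}); }, 3000);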

This script is the perfect place to add caching: the request parameters never change, and we can expect the response to be the same nearly every time.  So instead of making the request every time the function is run, I'm going to add a storage object so that I can store the response and use it when the function is called again.

In the script below, you can see I've added a very basic cache named repo_cache, with a result field on the object to store the data.  In a bit you'll see why I split it into a separate field, but for now you can see how it's being used.  I added a check to see if we have any result data; if so, we simply log the results from that data and return; otherwise we continue with the original process.  In addition, I split the logging out into a separate function so we can call it from each path.  The last change: when I successfully get data, I store the parsed result in repo_cache.result.
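Here's a sketch of the cached version (again, the full script is in the repo):

var request = require('request');

var options = {
  url: 'https://api.github.com/users/WakeskaterX/repos',
  headers: { 'User-Agent': 'WakeskaterX' }
};

// A very basic cache: just an object in memory
var repo_cache = {
  result: null
};

function getRepos(callback) {
  var start = +Date.now();

  // If we already have a result stored, skip the HTTP request entirely
  if (repo_cache.result) {
    logResults(repo_cache.result, start);
    return callback(null, repo_cache.result);
  }

  request(options, function (error, response, body) {
    if (error) return callback(error);

    var repos = JSON.parse(body);
    repo_cache.result = repos;  // store the parsed result for next time
    logResults(repos, start);
    callback(null, repos);
  });
}

// Logging split into its own function so both paths can use it
function logResults(repos, start) {
  var end = +Date.now();
  console.log('First repo:', repos[0].name);
  console.log('Elapsed:', end - start, 'ms');
}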


When you run this function, you'll see that the first request takes some time, and then the next three are almost instantaneous.

In my case, the first request duration was 440 ms, and the rest were zero because we had the data in memory.

So I successfully "cached" the response and got a much better response time thereafter.  But there's a problem: this kind of cache isn't very useful yet.  The data is going to get stale after a while, and if the web server stays running, the cached data will become inaccurate, so there needs to be some way to invalidate the cache, or turn it back off.  That's pretty easy to do: we store a timestamp of when we generated the cached data, along with a timeout, and if the timestamp plus the timeout is less than the current time, we make a new request and refresh the cache.

Here is a snippet showing the idea behind cache_advanced.js from the GitHub repo (the exact script is there); it reuses options and logResults from above:
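var repo_cache = {
  result: null,
  last_updated: 0,   // when we last refreshed the cache
  timeout: 1000      // how long (in ms) a cached result stays fresh
};

function getRepos(callback) {
  var start = +Date.now();

  // Use the cache only if it has data AND it hasn't expired yet
  if (repo_cache.result && repo_cache.last_updated + repo_cache.timeout > start) {
    logResults(repo_cache.result, start);
    return callback(null, repo_cache.result);
  }

  request(options, function (error, response, body) {
    if (error) return callback(error);

    var repos = JSON.parse(body);
    repo_cache.result = repos;
    repo_cache.last_updated = +Date.now();  // refresh the timestamp
    logResults(repos, start);
    callback(null, repos);
  });
}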

The changes I made were adding a last_updated field and a timeout field to the cache object, checking both of those during the cache lookup, and updating the last_updated field after each real request.

When I ran the script, since the timeout was set to 1 second and the time between requests was 1 second, the result was cached for a single request, then invalidated and refreshed, and the refreshed value was used for the final function call.

So that's, at a VERY basic level, what caching is.  There are a lot of things you can do to improve it and learn more, like using function names and parameters to create a hash key to store the results under.  This was just a quick and dirty way to create a cache for a script.  Hopefully, if you're new to web development, that cleared caching up a bit and made it a little easier to understand.

Feel free to download the Github Repository and grab all the scripts here:
https://github.com/WakeskaterX/Blog/tree/master/2016_04_Caching




Tuesday, April 12, 2016

JavaScript - Keeping your code DRY with Function Currying

If you are like me when I first heard about currying, you were probably thinking: well, that sounds delicious, but what the heck is it?  Sadly, it is not an aromatic dish, but it IS a super useful thing to know about in JavaScript, and it really isn't that hard to understand and put into use once you get the hang of it.

There are a lot of really technical definitions and lots of great information out there about function currying, but if you haven't taken classes on algorithms or don't have a Math or CS degree, it can seem pretty confusing.  Function currying is creating a function that returns another function, with parameters preset from the first function's arguments.  OK, not so simple in words, but it's not that complex.  For an exact definition, feel free to read up on it at Wikipedia, but since it's easier to just look at some code and see how it works, let's take a look and see exactly what it is.

The Basic Example

Below is a basic example that's often shown to demonstrate function currying.  First we have a very basic add function: it takes 2 inputs and returns them added together.  That's simple, that makes sense, it works just like you'd expect it to.
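function add(x, y) {
  return x + y;
}

// Curried version: call it with y, get back a function that takes x
function addCurried(y) {
  return function (x) {
    return x + y;
  };
}

var add_5 = addCurried(5);  // y is locked in as 5 via the closure

console.log(add(3, 5));         // 8
console.log(add_5(3));          // 8
console.log(addCurried(5)(3));  // 8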

The addCurried function, however, looks a bit different.  This is an example of function currying.  What is happening is that addCurried isn't returning a value, it's returning an entire function, and creating a scope with y set to the value passed in.  You can see it being used to create a function called add_5, where the y value in the returned function is set to 5.  That function can then be used just like the add function, except that you only pass in the x value, because y is always 5.  You can also call it like addCurried(5)(3) to get 8, since addCurried returns a function which can then be called with the parameter (3).

If you don't know a ton about scope and closures in JavaScript and you're thinking:  How can this all be so?!  Go read You Don't Know JS - Scopes and Closures.  It will really take your understanding of JavaScript scopes to a new level (and it's free online).  Function currying isn't that scary at all, it's just a way to create a function with "preset values" so that you can reuse parts of a function.  This becomes extremely useful when building callback functions and helps keep your code DRY.

Practical Usage

There are TONS of uses for function currying, but one that I often employ is using function currying to create middleware or callbacks that can be modified for multiple functions.

For example, say you're using Express and you want to authenticate a user token against different scopes for different endpoints.  Middleware makes that super easy, but you could spend a lot of time with anonymous functions doing logic for each endpoint if you don't keep your code dry.

First let's take a look at a very basic section of code without function currying.  In this section of code below, you can see that I made 2 routes, and in each route, I'm checking a req.decoded_token.scopes value against a list of scopes required for this endpoint.  If the validation fails, the response redirects to another endpoint and doesn't allow the user to pass.
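Something like this; the routes and scope names here are made up for illustration, and I'm assuming an earlier piece of middleware has already decoded the token onto req.decoded_token:

var express = require('express');
var app = express();

// Each route repeats the same scope-checking logic inline
app.get('/admin', function (req, res) {
  var required = ['admin'];
  var valid = required.every(function (scope) {
    return req.decoded_token.scopes.indexOf(scope) !== -1;
  });
  if (!valid) return res.redirect('/login');
  res.send('Welcome, admin!');
});

app.get('/reports', function (req, res) {
  var required = ['admin', 'reports'];
  var valid = required.every(function (scope) {
    return req.decoded_token.scopes.indexOf(scope) !== -1;
  });
  if (!valid) return res.redirect('/login');
  res.send('Here are your reports.');
});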

This works, but it's messy, it isn't very extensible, and it isn't DRY at all.  An easy way to make this code look a lot nicer is to use function currying, or to be more precise, partial function currying.  Since we have 2 values that change from endpoint to endpoint (the scopes to validate against, and the endpoint to redirect to), we can create a middleware creator to curry those values onto a common logic base.
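Here's a sketch of that middleware creator, using the same made-up routes:

// Bake the scopes and the redirect target into a reusable middleware
function createMiddleware(scopes_to_validate, redirect_endpoint) {
  return function (req, res, next) {
    var valid = scopes_to_validate.every(function (scope) {
      return req.decoded_token.scopes.indexOf(scope) !== -1;
    });
    if (!valid) return res.redirect(redirect_endpoint);
    next();
  };
}

app.get('/admin', createMiddleware(['admin'], '/login'), function (req, res) {
  res.send('Welcome, admin!');
});

app.get('/reports', createMiddleware(['admin', 'reports'], '/login'), function (req, res) {
  res.send('Here are your reports.');
});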

With this second iteration of the code, you can see it looks a LOT nicer and is certainly more DRY.  The function createMiddleware takes 2 parameters, the scopes we want to validate against and the endpoint we want to redirect to, and returns a function which will be used as the middleware for our application endpoints.

Function Currying and Partial Currying can be extremely useful in JavaScript particularly when dealing with anonymous functions or callbacks, and I often employ it to build handlers and helpers that can create the anonymous functions I need while being flexible and letting me specify the parameters.

I hope these snippets of code gave you some examples of how you can use currying to keep your code DRY.  It is a really powerful skill to learn in JavaScript, and having a good handle on how to use it properly will save you a lot of time and effort.

Friday, April 8, 2016

Home Sweet Home

My wife and I recently bought a house (first time home owners!) and it has been a whirlwind of busy.  We went the fixer-upper route and bought an old home from the 50s, then pretty much ripped everything out and refinished the interior, doing much of the work ourselves.  We're in the home stretch: after four months of part-time renovations, we'll be able to move in in a couple of weeks.

When we bought the house, it was quaint, with a tiny kitchen, wallpaper EVERYWHERE (like... 3 layers worth), linoleum flooring in the kitchen, and a family room with wood paneling.  It was... pretty old fashioned as you can see:

The bones of the house were good though, and it was a home that was taken care of in a price range we could afford.  It has a small 1 car garage and a large back yard, which is good, because we have a young, rambunctious dog, who very much enjoys running around.

So, about 3 and a half months ago, we set to tearing the place apart.  Wallpaper removal, wall removal, floor removal, etc.  Demolition took around 2 months of spending our nights and weekends at the house.  We had a massive 40 yard dumpster in our driveway for a good month and a half of that too, which ended up leaving dents in the pavement.

One of the toughest parts was the old family room.  That wood paneling was covering up another layer of wood paneling, which was covering up... house siding.  Yup, this room was originally just a space between a freestanding garage and the original ranch house.  But instead of removing the siding when they connected the two to expand the house, they simply put a layer of wood veneer over it.  So under those 2 layers of wood paneling was the home's old wood siding.

This was probably the toughest room to rip apart.  We also spent a ton of time renovating the kitchen.  The old kitchen was tiny, with a bedroom next to it that you had to walk through to get to the rest of the house.

So we decided to remove the bedroom, extend the kitchen out another 8 feet or so, and extend the 1 bathroom to be less minuscule.  Here are some progress photos:

We took the old kitchen, bedroom and bathroom:



And ripped EVERYTHING out

And tore up the old linoleum to lay down dark slate tile (we flew my old man out to help some)

And installed a beautiful kitchen and bathroom! (I don't take nearly enough pictures to capture the progress well)



With real tile, painted walls, and new doors and trim.

So that's been taking up ALL of my free time, and why I haven't blogged much lately.  Also, my PC went kaput, so I need to buy some parts and a new hard drive and get it running again.  But this has been an insanely busy first quarter and it will likely stay busy for the rest of the year.  We have landscaping to do, the rest of the baseboard painting, installing a fence, tree removal... and we'll likely do most of it ourselves.

It's a blessing though, owning a home.  It really is great to be able to partake in the American Dream, especially coming from a family where my parents never owned a home; I'll be the first among my parents and siblings.

Honestly, I'm mostly looking forward to finishing up all the basic work so I can start integrating IoT stuff into the house.  The Amazon Echo looks pretty nifty, it'd be nice to pick one up and see what it's capable of.  Plus there are tons of new devices that people are inventing to connect to each other, it's pretty neat what's out there already.

So that's my home sweet home.  More of a personal post, but it's really a fun adventure, despite all the bone grinding and back aching work that goes into it.  (And I'm still under 30, why do I ache so much!)

Thanks for reading and hopefully I'll have some time to do some side projects once we move in and write about them.  Cheers!


Monday, February 1, 2016

GGJ 2016 - Cockamimey!



This past weekend was the Global Game Jam, a 48 hour blitz of game development & creation.  Participants can jam out to the yearly theme at one of hundreds of Jam sites from around the world.  First-time and aspiring game developers meld with the local game dev scene to try to create a playable game within the time limit.

My Jam location this year was the New England Institute of Technology, continuing a tradition with some friends of mine of doing a new site each year.  It was a fun, exhausting, and brain-melting time, but we were able to finish a pretty polished product within 48 hours.

The theme was "Ritual" and that first night we plopped down in a swimming pool in our hotel to do some aquatic brainstorming.  Floating around and throwing out every idea we could think of related to Ritual, we finally settled on doing a game about "Mating Rituals".  We knew we wanted to make a Gear VR game, so we settled on a game about Birds Dancing in order to win a mate.  Thus Cockamimey was born!

Cockamimey is a Simon Says type of game where you have to watch a competing bird, and copy their head movements and add your own to the chain in order to win the affection of the mate bird.  Players look around in VR and must watch carefully so they can mimic the moves.  The game was fun to make, despite running into a few hiccups along the way.  Originally we had wanted to do a multiplayer version of this (and we still might), but the first night I needed some additional software installed and the hotel internet was very slow, so I spent 4-5 hours just setting up my development environment before I could even get the Gear VR demo running.




But we did get it all set up, and Saturday was a blast of programming and development.  The other 2 people in the group, Ben & Dawn Taylor, are both phenomenal artists and handled all of the modeling, lighting and effects, and promotional art, which was perfect as I handled all of the programming and level design.  One thing we did well was to keep the scope fairly small.  You only have to make six successful moves in a row to win the game, and the demo is short, sweet, and gets the idea across.  And it's fun.

We had to keep the game small, and use as few effects as possible, as mobile VR can be rough on phones for rendering and heating.  We'll have to keep most of the scenes small like this, which works because the game consists of only three main characters.

The whole weekend was a blast and it was good to get back into Game development a bit as it has been a while since I've made anything game related.  If you have a Gear VR you can check out Cockamimey on the Google Play Store or at our Global Game Jam page, and if you check it out give us a holler and let us know what you think.

Tuesday, November 24, 2015

NodeJS Basics - Object Patterns & Differences

Ever wondered when you should use a Singleton vs. using an object instantiated for just that script or even just for that call?  Not knowing the scope of the object can cause all sorts of issues in a NodeJS application if you don't carefully consider when it gets instantiated and what is allowed to access it.

I'll quickly walk through a couple scenarios which look very similar, but have a much different impact on your code.  This may be obvious to you if you're a long time NodeJS developer, but it's something that has tripped me up a few times in the past.

First, let's take a look at the Singleton model and how it works.

The Singleton Model

In the NodeJS Handbook, Fred K. Schott introduces the NodeJS Singleton:
"In most languages, sharing an object across your entire application can be a complex process. These Singletons are usually quite complex, and require some advanced modifications to get working. In Node, however, this type of object is the default. Every module that you require is shared across your application, so there’s no need for any special classes or extra code."

In NodeJS we can very quickly and easily create singletons, but it's important to be careful of when to use them, so you don't get yourself in trouble.

Let's take a look at a sample function, which will be our singleton model using a basic Object Oriented Approach.  There are other ways to write Singletons as well, but I'm writing the Singleton this way to focus on the small differences between Singletons and Instantiated Objects and how to not get tripped up by them.


Above we have a basic function, MyClass, with a single attribute, name, and a single method on that class, askName.  Once it's added to module.exports, it is globally available for any of your scripts to require.  The important thing to note here is that you are exporting an already-instantiated object, new MyClass("testy1"), which goes into Node's module cache.  Any script that requires it references that same instantiated object.  So in a second script, you could access it like so:
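// script_one.js (filename made up for the example)
var myObject = require('./my_class.js');

module.exports = function () {
  myObject.name = 'Bob';  // modifies the one shared instance
  myObject.askName();     // "Hello, my name is Bob"
};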


And if you had a third script running in the application you could run it there as well:
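// script_two.js
var myObject = require('./my_class.js');

module.exports = function () {
  // Prints whatever name the shared instance has right now
  myObject.askName();
};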


Now this is the important thing to remember about Singletons, and the reason you use this pattern: if you change the name in the first script on the same NodeJS process, it changes in the second script as well.  So the following test script, which runs both scripts, will output "Hello, my name is Bob" twice:
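// test.js: run both scripts in the same process
var scriptOne = require('./script_one.js');
var scriptTwo = require('./script_two.js');

scriptOne();  // "Hello, my name is Bob"
scriptTwo();  // also "Hello, my name is Bob" -- same shared instance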


If you swap the order of these scripts, it will output "testy1" as the name first and then "Bob".  So just remember that when you modify values in a Singleton, it affects everything else that is accessing it.

When to Use

The Singleton pattern is particularly useful whenever you need to have something store values across all scripts.  Think of it as being "super global" to the node process.  But that also means that you shouldn't use it if you need to change things between web requests or want to have the object do different things in different areas of your process.

With this pattern, you don't need to create the objects every time.  When you require the module, you're accessing the already created and instantiated object, so you can use its methods and values right away.

The Object Oriented Model

So what if you don't want to use the Singleton, or it doesn't suit your needs?  Well instead of instantiating the object on the module.exports, you can simply pass the constructor back and let the calling script instantiate it.

This fits more along the lines of traditional object oriented programming.  Rather than one single global object, you allow the calling script to create each object as needed.
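A sketch of the same module, this time exporting the constructor:

// my_class.js, Object Oriented version
function MyClass(name) {
  this.name = name;
}

MyClass.prototype.askName = function () {
  console.log('Hello, my name is ' + this.name);
};

// Export the constructor itself: each calling script makes its own instance
module.exports = MyClass;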


The above script is exactly the same as the Singleton model, with one exception: instead of exporting a new MyClass with a set name, we simply export the constructor, the function MyClass itself.  Then in our scripts, when we require it, we must instantiate it, creating objects with their own values.
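For example (same made-up filename as before):

// Option 1: require the class, then instantiate it when you need it
var MyClass = require('./my_class.js');
var myObject = new MyClass('testy1');
myObject.askName();  // "Hello, my name is testy1"

// Option 2: instantiate directly off the require
var myOtherObject = new (require('./my_class.js'))('testy2');
myOtherObject.askName();  // "Hello, my name is testy2"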


There are 2 ways I normally go about instantiating an object from a module, as in the sketch above.  The first is to simply require the class and then instantiate it separately, when you need it.  Or, if you know that you only need to instantiate it once at the start of the script, say in the case where you have an error library or a debugging library that has settings specific to an individual script, you can instantiate directly off the require.

When to Use

The Object Oriented pattern is useful any time you want something to be local to your script or function.  One thing to watch out for is the scope of the object and where and when you instantiate a new one.  If you're going to be using something for a certain web request, make sure you instantiate it during that web request within the function.

One recent 'doh' moment I had was when I instantiated my object globally, but was using it in web requests that came in.  While Node was handling multiple requests, the object's logic was being shared by 2 different requests, and its data was being corrupted mid-request.  Moving the instantiation into the function where I handled the request solved the issue.

Looking back it seems like a simple mistake, but it threw me for a loop for a while trying to figure out how the data was being corrupted.

There are plenty of other patterns as well, but I wanted to focus on the very small change between a Singleton and a constructor-based pattern, which can make a world of difference in your NodeJS code if you don't realize how they differ.

Hopefully reading this saves you a headache in the future!  Thanks and if you enjoyed this article, please follow me on Twitter & Subscribe to the blog!  Cheers!

Wednesday, November 18, 2015

Node Card Creator - Sheetify JS - Automating Print Sheet Creation

The NodeJS Card Creator is finally coming together as a tool I can use to streamline the development process.  If you haven't read the first blog post, I encourage you to go check out how I built the Card Creator, and how you can set up your environment to do the same, in my post: Card Creator - Automating Card Creation in NodeJS.  You can also view the entire project at my GitHub account:  https://github.com/WakeskaterX/CardCreator

The next big piece, a simple but important part of the automation process, was finished just the other day: Sheetify.JS.  The Sheetify script takes all of the cards in the cards folder, groups them into sets of 9, which fit onto a standard 8.5" x 11" sheet of paper, draws the cards onto these sheets, and outputs them to a sheets folder, ready for printing.  This change lets me rapidly prototype by changing values, running 2 scripts, and then printing the desired sheets.  There is, of course, still the cutting out of the cards, so perhaps I can build some kind of cutting stencil, but in the meantime this has vastly improved prototyping speed.
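The real Sheetify.JS is in the repo; conceptually it does something like this (a sketch assuming the node-canvas library, v2+, and cards/ and sheets/ folders of PNGs):

var fs = require('fs');
var path = require('path');
var createCanvas = require('canvas').createCanvas;
var loadImage = require('canvas').loadImage;

// 8.5" x 11" sheet at 300 DPI, with a 3x3 grid of 2.5" x 3.5" cards
var SHEET_W = 2550, SHEET_H = 3300;
var COLS = 3;
var CARD_W = 750, CARD_H = 1050;
var PER_SHEET = 9;

var files = fs.readdirSync('cards').filter(function (f) {
  return /\.png$/.test(f);
});

// Group the card images into sets of 9, one set per sheet
var groups = [];
for (var i = 0; i < files.length; i += PER_SHEET) {
  groups.push(files.slice(i, i + PER_SHEET));
}

groups.forEach(function (group, sheetIndex) {
  var canvas = createCanvas(SHEET_W, SHEET_H);
  var ctx = canvas.getContext('2d');

  Promise.all(group.map(function (f) {
    return loadImage(path.join('cards', f));
  })).then(function (images) {
    // Draw each card into its slot in the grid
    images.forEach(function (img, i) {
      var x = (i % COLS) * CARD_W;
      var y = Math.floor(i / COLS) * CARD_H;
      ctx.drawImage(img, x, y, CARD_W, CARD_H);
    });
    fs.writeFileSync(path.join('sheets', 'sheet_' + sheetIndex + '.png'),
      canvas.toBuffer('image/png'));
  });
});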

Here is an example of the output sheet:


In addition to Sheetify, the other thing I updated in the script was the ability to specify card images for the artwork section.  I made a few shitty drawings in Paint.NET to test it, but it works very well.

Here is an example of a Fire Spell with the image added to the card:

And that's pretty much it!  The sheet automation has been super helpful, so the next step is finding an artist to work with me on the project and playtest, playtest, playtest to hammer down the mechanics!

Monday, November 2, 2015

Updates to the Card Creator & Work on Sorceror's Arena

I've got some updates to the Card Creator application out (which you can find on GitHub here:  https://github.com/WakeskaterX/CardCreator ) as well as some updates to a card game I'm working on, tentatively named Sorceror's Arena.

Now I'm no amazing designer, but I managed to improve the image quality a bit and have a slightly better design for the automated card creation.  Before, I was doing a lot of the card creation by hand in Paint.NET, and it certainly gets tedious as I make small changes and need to create demo print sheets.  Reducing that workload was the purpose of the Card Creator, and it's finally starting to look better than the cards I've created by hand, which is why I haven't been using it until now.

So here is a quick look at what my currently printed prototypes look like, what the old generated cards looked like, and the recent change I just made to the card creator.

Current Prototype

Old Generated Image

New Generated Image

It is a little hard to tell in the two generated images, but the icon fidelity went up greatly, and the fonts look better too.  Before, the generated images didn't have any kind of indicator for rarity, so that was added as well to give the cards some differentiation.

There is still a lot to do: I need images, background art, a decent background for the description box, and the JSON files for the Lightning and Water sets, all just to make a first decent prototype.  That said, I may still hold off on images until I can find an actual artist to work with.

Little by little I find time to work on it and polish the game, in addition to the play testing and adjusting card values so nothing is TOO overpowered.  The project is still very early so it'll be a long time before any kind of release ever happens.