I wrote a poem for the poetry contest at our work holiday party, so here it is.
Heart of Winter
Angel feathers fall from a cotton sky
As the arctic wind pinches my strawberry nose.
I pull the wool scarf my mother knit, up around my face,
Blinking with frost laden lashes.
Cinnamon steam swirls from my mug,
Lifting with it the scent of apple, orange and brandy.
One sip extinguishes the chill, a warm hug from the inside out.
Inside children sit near a playful fire,
Bathed in its flickering warmth,
Reading stories of pirates and books of adventures in the clouds,
Hours slipping by without a care.
The tree in the corner nestles down on kaleidoscope boxes,
Her bows protecting them like a mother hen,
Dressed in a sparkling cacophony of color.
Night emerges and peaceful silence settles in,
Crackles and pops from a dying fire sparsely interrupting.
Back and forth the chair rocks, as eyelids droop and consciousness wanes.
Sleep encroaches quietly and a carefully placed blanket becomes a warm embrace
In the heart of winter.
Wednesday, December 14, 2016
Tuesday, August 30, 2016
Schema Check
It has been a while since I've done a post about side projects, but lately I've been working on a little NPM module called Schema Check, available on NPM and GitHub, and I wanted to share it. Schema Check is a lightweight JavaScript object type enforcer which uses private properties and Object.defineProperty to create getters and setters for various fields on a JavaScript object.
Schema Check uses schema objects to describe the fields on an object and the types they should accept. It then modifies the getters and setters of that object so that only values of the right type can be set. This came out of a desire for something that could enforce some types in JS without going all the way to something like TypeScript. It's also just a fun side project I can work on here and there in my spare time.
How it works
Schema Check at its core is built on Object.defineProperty. By using defineProperty, Schema Check lets you apply a schema to an existing object, which forces the fields of that object to conform to certain constraints.
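To make that concrete, here is a minimal sketch of the general mechanism (my own illustration, not the library's actual source), assuming a silent-failure policy:

```javascript
// Back a field with a hidden value kept in a closure, and install a
// getter/setter pair that type-checks every assignment.
function enforceType(obj, field, type) {
  var value = obj[field]; // capture the current value in the closure
  Object.defineProperty(obj, field, {
    get: function () { return value; },
    set: function (next) {
      if (typeof next !== type) return; // wrong type: fail silently
      value = next;
    }
  });
}

var dude = { name: 'Chad', cash: 100 };
enforceType(dude, 'name', 'string');

dude.name = 5;      // ignored - not a string
dude.name = 'Brad'; // accepted
console.log(dude.name); // 'Brad'
```

The real library layers schema parsing, regex checks, and error options on top of this same defineProperty trick.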
Take, for example, the following object:
```javascript
var dude = { name: "Chad", is_single: true, cash: 100 };
```
Our dude, Chad, has three properties: name (a string), is_single (a boolean) and cash (a number). If we're manipulating data and working with the dude object in JavaScript, it'd be really nice to limit what we can set to those fields. There are many solutions for this, such as the rise of TypeScript, but Schema Check is a lightweight way to enforce these values with a simple schema.
By importing schema check like so and setting a schema for the dude object, we can force the object to throw an error (or fail silently) when an invalid value is set to that property.
```javascript
var SchemaCheck = require('schema-check');

var schema = {
  name: {
    type: 'string',
    regex: /^[a-z]+$/i  // used by regex.test() to check validity
  },
  is_single: {
    type: 'boolean',
    is_editable: false  // poor Chad - this status isn't changing any time soon
  },
  cash: {
    type: 'number',
    allow_nulls: false  // this can be true too if you want to allow null as a valid entry
  }
};

var options = {
  throw_error: false  // determines whether setting an invalid property type throws an error or fails silently
};

SchemaCheck(dude, schema, options);

dude.name = 5;          // fails - name is still "Chad"
dude.name = '123';      // also fails - it doesn't match the regex
dude.is_single = false; // fails - Chad is still single
dude.cash = 110;        // succeeds - Chad is a bit richer now!
```
I have a lot more I want to do with the library. At the moment it doesn't really support arrays, and there are extra restrictions I want to add as well (like min/max limits for numbers).
It's a work in progress, so feel free to contribute to the GitHub repo and create PRs and issues if you think of anything that should be added.
Sunday, July 3, 2016
All The World's A Game
All the world's a game,
And all the men and women merely players;
They have their PKs and their respawns,
And one man in his time plays many classes,
His act being seven stages. At first the neophyte,
Mewling and button mashing in wanton alarm,
Then the whining acolyte, with his light inventory
And shining face of wonder, crawling past monsters,
Unwilling to fight. And then the expert,
Intense like a furnace, a woe-filled blade
Made to conquer fiends. Then a raider,
Bags filled with strange trinkets and shrouded in epics,
Jealous of others' gear, sudden and quick to gank noobs,
Seeking to expand his reputation
Ever running his mouth. And then the veteran,
His house filled with grand loot and lined with treasure,
With eyes wizened by dragons and warlocks,
Full of strange tales of adventures past;
And so he plays his part. The sixth stage shifts
Into the quiet and skeptical sage,
Watching the spectacles of new ages from side to side;
His intense vigor, lost to a world too wide
For his excitement, and his boisterous voice,
Turning again to quiet wonder and solitude
He crafts on his own. Last scene of all,
That ends this strange and beautiful game,
Is the second desire for youthful remembrance,
Sans guild, sans friends, sans time, sans everything.
Wednesday, June 8, 2016
Card Creator V2 - A smoother workflow
I had a couple of goals in mind when I set out to rebuild the Card Creator. First, I wanted to be able to update a Google Spreadsheet document and have the Card Creator read directly from it when building the cards. Second, I wanted to remove the Cairo dependency, because it wasn't working very well and I wanted to use Node 4, which broke node-canvas.
Version 2 of the Card Creator does exactly that. I hooked the Card Creator up to Google Sheets using the npm module google-spreadsheet, and I swapped out node-canvas and Cairo for LibGD and node-gd. That entailed a full rebuild of the card creation logic, but it was good to go back over it, update it, and smooth out some of the minor issues.
In addition I redrew much of the art for the icons so that I could use non-pixel art instead, which scales better. Granted, I am no artist, but I am doing all of the temporary art until I can find an artist to work with, if I decide to pursue this project seriously.
If you haven't read my previous post: Card Creator is an application written in NodeJS that automates creating playing cards for a card game I'm working on. It lets me rapidly prototype, and it handles design, combining art assets, and so on into print sheets for ease of production.
This new Card Creator program has improved my workflow significantly. With the old program, making card changes required me to update the JSON files individually and change the values for each key/value pair in order to generate the cards. Now I simply create a new row in my google document and run the program and it pulls in all the changes. It's really nice for rapid prototyping. I'm able to tweak values, update the descriptions, and then immediately run the program to generate the cards, and the sheetify script to create print sheets.
The script has a few points of interest you'll want to consider when using it. First, it's run with the command node creator-google-sheets.js, which is the primary file (make sure you're on the version2 branch). Before you can run it, you need to include a spreadsheet key for a Google document. To use the sheet, you'll need to publish it to the web, which gives you a link containing a long string. That's your spreadsheet key.
I put my key in a private.js file which isn't included in the repository, so you can do the same, or you can simply replace require('./private.js').google_sheets_key; on line 15 of the script with the string itself. In addition, if your spreadsheet has differently named headers, you'll need to update those in the mapping function (convertRow) on line 73, which maps the row data to the fields that the icon creator, background creator, etc. are expecting.
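As a sketch, that private.js pattern can look like this; the environment-variable fallback is my own addition, not something from the Card Creator repo:

```javascript
// Load the spreadsheet key from an untracked private.js file (add it to
// .gitignore), falling back to an environment variable if the file is
// missing. The GOOGLE_SHEETS_KEY variable name is a hypothetical choice.
function loadSheetsKey() {
  try {
    return require('./private.js').google_sheets_key;
  } catch (e) {
    // private.js isn't present (e.g. a fresh clone of the repo)
    return process.env.GOOGLE_SHEETS_KEY || null;
  }
}
```

This keeps the secret out of version control while letting anyone who clones the repo supply their own key.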
The Card Creator works similarly to before: it draws the background, the title, the description and the icons, and then outputs the card as a PNG file. One thing to note is that if you're working with node-gd, the documentation isn't fantastic, so it might take some trial and error to figure out exactly how the functions work. For example, it took me a little while to figure out which copy function worked for copying one image onto another. Use copyResampled; at least, that's the only one that worked well for what I was doing.
I'm also rebranding the Card game I'm working on a bit, tossing some names around trying to find one I like. I have a couple in mind, so we'll see if they stick in play testing.
Anyway, feel free to check out the new Card Creator on my GitHub page. If you're working with NodeJS, linking up to Google Sheets documents is really quite simple and fantastic for automating work. You can make the document read only (like I did) or allow programs to write to cells as well. Nifty!
Cheers and thanks for reading!
Jason C / WakeskaterX
Friday, April 29, 2016
Introduction to Caching in JavaScript and NodeJS
Disclaimer: This article is intended for new web developers who don't know much about caching and would like to have a basic idea. It is not a comprehensive post, nor is it supposed to accurately reflect advanced caching systems. All of the scripts referenced in this post can be downloaded from my GitHub Repository.
When I was first starting out in web development, I heard people talk a lot about this "cash" system that you could use in websites and your code, but I honestly didn't know much about it. I eventually figured out it was a way to store data so you could access it faster, but I still thought it was some complicated architecture or special system. I was really just overthinking it.
A cache is simply a place in memory where you can store data that is accessed often. In the context of JavaScript, this can be as simple as a global variable that you put the results of an AJAX call into. Your function then looks there first: if something is there, it returns that; otherwise it makes the original call. There are a lot of libraries out there that will do this for you, but we're going to build a very basic caching system and walk through each of the parts.
I'm going to use NodeJS for this blog post, but this works just as well in the browser with an AJAX request or something similar.
First, I'll create a function that calls out to the GitHub API and returns a value. I'll walk through the function and then add some caching to show how it's done. Below we have the basic script, which you can run with:
node nocache.js
For this tutorial, I am using the npm library request for ease of use, and am making a request to the GitHub API to query my repositories for their data. This is very similar to a request you would use in Express or another web framework, and I have done very similar things for other projects. One thing to note is that the request library lets you easily specify the options for the HTTP request, and the GitHub API requires a User-Agent in the header, which is why that's there. If you don't include one, GitHub will return an error and reject your request.
Next, on line 15, I created the function to make the request. For this tutorial I have a bunch of logs, plus start and end times to track how many milliseconds the entire request takes. I set the start to +Date.now(), which is just the current time as a number (in ms), and then logged the start time. The data in the body comes as a JSON string, so I parsed that and logged the name of the first item, since it's an array of information.
Finally, there are some logs to output the ending time and the total time elapsed for the request. On average I get roughly 400 ms on a good connection. If this is inside a web request, you don't want to be adding half a second to a second to every request, on top of everything else the server is doing. To simulate this, the script nocache_multi.js has a few setTimeouts to repeatedly call the same function. When you run it, each call takes a similar amount of time.
This script is the perfect place to add caching, because the request parameters don't change and we can expect the response to be the same most of the time. So instead of making the request every time the function is run, I'm going to add a storage object so that I can store the response and use it when the function is called again.
In the script below, you can see I've added a very basic cache named repo_cache on line 14, with a result field to store the data. In a bit you'll see why I split it into a separate field, but you can see how it's being used below. On line 25 I added a check to see if we have any result data: if so, we simply log the results from that data and return; otherwise we continue with the original process. In addition, I split the logging out into a separate function so we can call it from each path. The last change was that when I successfully get data, I store the parsed result in repo_cache.result.
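Since the full script isn't reproduced here, the following is a condensed, runnable sketch of the same pattern, with a fake fetchRepos standing in for the real GitHub request:

```javascript
// The cache object: result starts empty and is filled on the first fetch.
var repo_cache = { result: null };

// Stand-in for the slow HTTP request made in nocache.js.
function fetchRepos(callback) {
  setTimeout(function () {
    callback(null, [{ name: 'Blog' }]);
  }, 50);
}

function getRepos(callback) {
  if (repo_cache.result) {
    // cache hit: return immediately, no request made
    return callback(null, repo_cache.result);
  }
  fetchRepos(function (err, data) {
    if (err) return callback(err);
    repo_cache.result = data; // store the result for next time
    callback(null, data);
  });
}
```

The first getRepos call pays the full request cost; every later call returns synchronously from repo_cache.result.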
When you run this version, you'll see that the first request takes some time and the next three are almost instantaneous. In my run, the first request took 440 ms, and the rest took zero because we had the data in memory.
So I successfully "cached" the response and had a much better response time thereafter. But there's a problem: this kind of cache isn't very useful as-is, because the data is going to get stale. If the web server stays running for a while, the data will become inaccurate, so there needs to be some way to invalidate the cache. That's pretty easy to do: we store a timestamp of when we generated the cached data, along with a timeout, and if the timestamp plus the timeout is less than the current time, we make a new request and refresh the cache.
Here is a snippet of the script cache_advanced.js from the GitHub repo:
The changes I made were adding last_updated and timeout fields to the cache object, checking those two during the cache check, and updating last_updated after the request.
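As a rough sketch of that invalidation logic (the field names come from the post; the helper functions are my own framing):

```javascript
var repo_cache = {
  result: null,
  last_updated: 0, // ms timestamp of the last refresh
  timeout: 1000    // ms - the same 1 second window used in the post
};

// The cached result is only valid while last_updated + timeout
// is still in the future.
function cacheIsFresh(cache, now) {
  return cache.result !== null && (cache.last_updated + cache.timeout) > now;
}

// On a cache miss, store the fresh data and record when we got it.
function storeResult(cache, data, now) {
  cache.result = data;
  cache.last_updated = now;
}
```

In the real script, `now` is just Date.now() at the moment of the check; passing it in explicitly makes the window easy to reason about (and to test).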
When I ran the script, the timeout was set to 1 second and the time between requests was 1 second, so the result was cached for a single request, then invalidated and refreshed, and then accessed again for the final function call.
So that's, at a VERY basic level, what caching is. There's a lot you can do to improve it and learn more, such as using function names and parameters to create a hash key to store results under. This was just a quick and dirty way to create a cache for a script. Hopefully, if you're new to web development, that cleared up caching a bit and made it a little easier to understand.
Feel free to download the Github Repository and grab all the scripts here:
https://github.com/WakeskaterX/Blog/tree/master/2016_04_Caching
Tuesday, April 12, 2016
JavaScript - Keeping your code DRY with Function Currying
If you are like me when I first heard about currying, you were probably thinking: well, that sounds delicious, but what the heck is it? Sadly, it is not an aromatic dish, but it IS a super useful thing to know about in JavaScript, and it really isn't that hard to understand and put into use once you get the hang of it.
There are a lot of really technical definitions and lots of great information out there about function currying, but if you haven't taken classes on algorithms or don't have a Math or CS degree, it can seem pretty confusing. Function Currying is creating a function that returns another function with parameters set from the first function. Ok, not so simple in words, but it's not that complex. For an exact definition of it feel free to read up on it at Wikipedia, but since it's easier to just look at some code and see how it works, let's just take a look and see exactly what it is.
The Basic Example
The basic example often shown to demonstrate function currying starts with a very simple add function: it takes two inputs and returns their sum. It works just like you'd expect. The curried version, addCurried, looks a bit different. Instead of returning a value, it returns an entire function, creating a scope with y set to the value that was passed in. You can use it to create a function called add_5, where the y value in the returned function is fixed to 5. That function works just like add, except you only pass in the x value, because y is always 5. You can also call it like addCurried(5)(3) to get 8, since addCurried returns a function which can then be called with the argument (3).
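Since the original example image isn't included here, a reconstruction of what the text describes, using the add, addCurried, and add_5 names from the post, looks like this:

```javascript
// Plain version: both arguments supplied at once.
function add(x, y) {
  return x + y;
}

// Curried version: supply y first and get back a function awaiting x.
function addCurried(y) {
  return function (x) {
    return x + y; // y is captured in the closure created by addCurried
  };
}

var add_5 = addCurried(5); // a reusable "add 5" function

console.log(add(3, 5));        // 8
console.log(add_5(3));         // 8 - y is always 5
console.log(addCurried(5)(3)); // 8 - both calls in one go
```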
If you don't know a ton about scope and closures in JavaScript and you're thinking: How can this all be so?! Go read You Don't Know JS - Scopes and Closures. It will really take your understanding of JavaScript scopes to a new level (and it's free online). Function currying isn't that scary at all, it's just a way to create a function with "preset values" so that you can reuse parts of a function. This becomes extremely useful when building callback functions and helps keep your code DRY.
Practical Usage
There are TONS of uses for function currying, but one that I often employ is using function currying to create middleware or callbacks that can be modified for multiple functions.
For example, say you're using Express and you want to authenticate a user token against different scopes for different endpoints. Middleware makes that super easy, but you could spend a lot of time writing anonymous functions with per-endpoint logic if you don't keep your code DRY.
First, let's look at a very basic version without function currying. In it, I made two routes, and in each route I check the req.decoded_token.scopes value against a list of scopes required for that endpoint. If the validation fails, the response redirects to another endpoint and doesn't let the user through.
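The original code isn't shown here, but a sketch of that "before" version might look like the following; app is a tiny stand-in for an Express app, and the routes, scopes, and /login endpoint are illustrative guesses:

```javascript
// Minimal stand-in for an Express app so the sketch runs on its own.
var routes = {};
var app = {
  get: function (path, handler) { routes[path] = handler; }
};

// Hypothetical helper: does the token carry every required scope?
function hasScopes(token_scopes, required) {
  return required.every(function (scope) {
    return token_scopes.indexOf(scope) !== -1;
  });
}

app.get('/admin', function (req, res) {
  // the same validation logic, repeated inline in every route
  if (!hasScopes(req.decoded_token.scopes, ['admin'])) {
    return res.redirect('/login');
  }
  res.send('admin page');
});

app.get('/profile', function (req, res) {
  if (!hasScopes(req.decoded_token.scopes, ['user'])) {
    return res.redirect('/login');
  }
  res.send('profile page');
});
```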
This works, but it's messy, it isn't very extensible, and it isn't DRY at all. An easy way to make this code a lot nicer is to use function currying, or to be more precise, partial currying. Since we have two values that change from endpoint to endpoint (the scopes to validate against and the endpoint to redirect to), we can create a middleware creator that curries those values onto a common logic base.
With this second iteration of the code, you can see it looks a LOT nicer and is certainly more DRY. The function createMiddleware takes two parameters, the scopes we want to validate against and the endpoint we want to redirect to, and returns a function which will be used as the middleware for our application endpoints.
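A sketch of what createMiddleware could look like, based on the description above (the specific scopes and endpoints are illustrative):

```javascript
// Curry the two per-endpoint values - required scopes and the redirect
// target - onto a shared validation base. The returned function has the
// standard (req, res, next) middleware shape.
function createMiddleware(required_scopes, redirect_endpoint) {
  return function (req, res, next) {
    var token_scopes = req.decoded_token.scopes;
    var valid = required_scopes.every(function (scope) {
      return token_scopes.indexOf(scope) !== -1;
    });
    if (!valid) return res.redirect(redirect_endpoint);
    next();
  };
}

// Each endpoint now states only what varies:
var adminOnly = createMiddleware(['admin'], '/login');
var userOnly = createMiddleware(['user'], '/login');
```

In an Express app these would be dropped straight into the route definitions, e.g. app.get('/admin', adminOnly, handler).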
Function Currying and Partial Currying can be extremely useful in JavaScript particularly when dealing with anonymous functions or callbacks, and I often employ it to build handlers and helpers that can create the anonymous functions I need while being flexible and letting me specify the parameters.
I hope these snippets gave you some examples of how you can use currying to keep your code DRY. It is a really powerful technique in JavaScript, and having a good handle on how to use it properly will save you a lot of time and effort.
Friday, April 8, 2016
Home Sweet Home
My wife and I recently bought a house (first time home owners!) and it has been a whirlwind of activity. We went the fixer-upper route and bought an old home from the 50s, ripped nearly everything out, and refinished the interior, doing much of the work ourselves. We're in the home stretch and, after four months of part-time renovations, will be able to move in in a couple of weeks.
When we bought the house, it was quaint, with a tiny kitchen, wallpaper EVERYWHERE (like... 3 layers worth), linoleum flooring in the kitchen, and a family room with wood paneling. It was... pretty old fashioned as you can see:
The bones of the house were good though, and it was a home that was taken care of in a price range we could afford. It has a small 1 car garage and a large back yard, which is good, because we have a young, rambunctious dog, who very much enjoys running around.
So, about 3 and a half months ago, we set to tearing the place apart. Wallpaper removal, wall removal, floor removal, etc. Demolition took around 2 months of spending our nights and weekends at the house. We had a massive 40 yard dumpster in our driveway for a good month and a half of that too, which ended up leaving dents in the pavement.
One of the toughest parts was this room to the left. That wood paneling was covering up another layer of wood paneling, which was covering up.... house siding. Yup, this room was originally just a space between a standing garage and the original ranch house. But instead of removing the siding when they connected the two to expand the ranch house, they simply put a layer of wood veneer over it.
Here you can see the old wood siding that was under 2 layers of that wood paneling.
This was probably the toughest room to rip apart. We also spent a ton of time renovating the kitchen as well. The old kitchen was a tiny little kitchen with a bedroom next to it that you had to walk through to get to the rest of the house.
So we decided to remove the bedroom, extend the kitchen out another 8 feet or so, and expand the single bathroom to be less minuscule. Here are some progress photos:
We took the old kitchen, bedroom and bathroom:
And ripped EVERYTHING out
And tore up the old linoleum to lay down dark slate tile (we flew my old man out to help some)
And installed a beautiful kitchen and bathroom! (I don't take nearly enough pictures to get the progress well)
With real tile, painted walls, and new doors and trim.
So that's been taking up ALL of my free time, which is why I haven't blogged much lately. My PC also went kaput, so I need to buy some parts and a new hard drive to get it running again. It has been an insanely busy first quarter, and it will likely stay busy for the rest of the year: landscaping, the rest of the baseboard painting, installing a fence, tree removal... and we'll likely do most of it ourselves.
Owning a home is a blessing, though. It really is great to partake in the American Dream, especially coming from a family where my parents never owned a home; I'll be the first among my parents and siblings.
Honestly, I'm mostly looking forward to finishing all the basic work so I can start integrating IoT stuff into the house. The Amazon Echo looks pretty nifty; it'd be nice to pick one up and see what it's capable of. Plus, there are tons of new devices people are inventing to connect to each other, and it's pretty neat what's out there already.
So that's my home sweet home. More of a personal post, but it's really been a fun adventure, despite all the bone-grinding and back-aching work that goes into it. (And I'm still under 30, so why do I ache so much?)
Thanks for reading and hopefully I'll have some time to do some side projects once we move in and write about them. Cheers!
Monday, February 1, 2016
GGJ 2016 - Cockamimey!
This past weekend was the Global Game Jam, a 48-hour blitz of game development and creation. Participants jam out to the yearly theme at one of hundreds of jam sites around the world, where first-time and aspiring game developers meld with the local game dev scene to try to create a playable game within the time limit.
My jam location this year was the New England Institute of Technology; some friends and I have made a tradition of trying a new site each year. It was a fun, exhausting, brain-melting time, but we were able to finish a pretty polished product within 48 hours.
The theme was "Ritual" and that first night we plopped down in a swimming pool in our hotel to do some aquatic brainstorming. Floating around and throwing out every idea we could think of related to Ritual, we finally settled on doing a game about "Mating Rituals". We knew we wanted to make a Gear VR game, so we settled on a game about Birds Dancing in order to win a mate. Thus Cockamimey was born!
Cockamimey is a Simon Says type of game: you watch a competing bird, copy its head movements, and add your own to the chain in order to win the affection of the mate bird. Players look around in VR and must watch carefully so they can mimic the moves. The game was fun to make, despite a few hiccups along the way. Originally we had wanted to do a multiplayer version (and we still might), but on the first night I needed some additional software installed, and the hotel internet was so slow that I spent 4-5 hours just setting up my development environment before I could even get the Gear VR demo running.
But we did get it all set up, and Saturday was a blast of programming and development. The other two people in the group, Ben & Dawn Taylor, are both phenomenal artists and handled all of the modeling, lighting, effects, and promotional art, which was perfect, as I handled all of the programming and level design. One thing we did well was keeping the scope small: you only have to make six successful moves in a row to win, so the demo is short, sweet, and fun, and gets the idea across.
We had to keep the game small and use as few effects as possible, since mobile VR can push phones hard on rendering and heat. Most scenes will have to stay small like this, which works because the game consists of only three main characters.
The whole weekend was a blast, and it was good to get back into game development, as it has been a while since I've made anything game related. If you have a Gear VR, you can check out Cockamimey on the Google Play Store or at our Global Game Jam page; if you do, give us a holler and let us know what you think.