AWS: CloudFront Static Caching

Hi, dear reader! Happy Independence Day from the Philippines! ❤

In this blog post, we will be looking into AWS CloudFront, AWS’s Content Delivery Network (CDN) service. This post is long overdue and has been sitting in my drafts for some time now.

One of the factors that can affect user experience on websites and applications is loading time. Have you ever encountered a site you were very excited about, but unfortunately its images and content took A LOT of time to load? And on every page, you waited two minutes or more for images to appear? That can definitely take the excitement away, can’t it?

This is a nightmare for both business and product owners, as it can hurt conversion and increase bounce rates. There are many possible solutions, and in this post we will see how AWS CloudFront can be used as a caching mechanism for our static files.

Almost every website serves static files, be it images, CSS files, JavaScript files, or whole static pages. Since these don’t change too frequently, we can cache them so that subsequent requests no longer hit our server (and can even be served faster, as AWS CloudFront determines the edge location nearest to your customer).

AWS CloudFront accelerates content delivery through its many edge locations, automatically picking the edge location that can deliver fastest to your customers. For a bit of trivia, we actually have one edge location here in Manila. 🙂 CloudFront also has no upfront cost; you only accrue charges every time your content is accessed.

If you’ll be following this tutorial and creating your bucket, I suggest placing it in the US Standard region, which maps to the default AWS S3 endpoint. Based on my experience, putting a new bucket in a different region may cause faulty redirects (i.e. requests temporarily routed to the wrong facility) in the beginning. And since we will be immediately experimenting with AWS CloudFront, these faulty redirects may end up cached.

I. Creating S3 Bucket

AWS CloudFront works seamlessly with AWS services like EC2 and S3, but it can also serve content from servers outside of AWS. For this quick example, we will be working with AWS Simple Storage Service (S3).


Screen Shot 2016-06-12 at 2.05.10 PM.png

II. Uploading Your File

Screen Shot 2016-06-12 at 2.02.43 PM.png

Also make sure that the file is viewable by everyone before you access it via CloudFront. Otherwise, the permission-denied error message might be what gets cached.


Once you’re done granting permissions, try accessing the image we just uploaded via the link in the upper part of the properties pane.

For our example, we have:


 Screen Shot 2016-06-12 at 2.08.16 PM.png


III. Creating AWS CloudFront Distribution

We now go to CloudFront from our Services menu.

Screen Shot 2016-06-12 at 2.09.49 PM.png

Then we click the ‘Create Distribution’ button.


For our purposes, we will choose ‘Web’:


And choose the bucket that we just created a while ago as the origin:

Screen Shot 2016-06-12 at 8.40.15 PM.png

We can retain all other defaults for this example. If you wish to explore more on the other options, you may click on the information icon (filled circle with i) for more details on a specific option.

Once done, we just need to wait for our distribution to be deployed.

Screen Shot 2016-06-12 at 8.41.42 PM.png

IV. Accessing Your Static File via AWS CloudFront

Once your CloudFront distribution’s status is DEPLOYED, you may now access your static file at the domain name specified by CloudFront.

AWS S3 Link:

AWS CloudFront Link:

We simply replaced the S3 endpoint and bucket name with the domain name that CloudFront assigned to our distribution.

V. Updating Your Static File

A. Deleting / Modifying Your Static File in AWS S3

Say we want to update the static file we have cached with AWS CloudFront. Modifying or deleting the file in AWS S3 won’t change what is served via the CloudFront URL until the cache expires and a user requests the file again.

B. Versioning Your Static File

To show an updated or different version of a static file to users, AWS recommends distinguishing versions of your static files by name instead of reusing the same name.

For example, if we have sample.png, its version 2 can be named sample_2.png. Of course, this approach requires updating every location where the old links were used.

C. Invalidating A CloudFront Asset

If it is too tedious to change every occurrence of the old links, another method exists: asset invalidation. AWS CloudFront allows invalidating assets even before the cache expires, forcing edge locations to query your origin for the latest version.

Note, though, that only a limited number of invalidation paths are free each month; beyond that, each invalidated path incurs a charge.

To invalidate an asset, we choose the distribution we are interested in from the list of our existing CloudFront distributions:


Once chosen, we then click on the ‘Invalidations’ tab and click on ‘Create Invalidation’.


We then put in the object path we want invalidated. This field also accepts wildcards, so ‘/images/*’ is valid too. But for our purpose, since we only want sample.png invalidated, we put:

Screen Shot 2016-06-12 at 9.40.16 PM.png
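If you prefer the command line, the same invalidation can be issued via the AWS CLI. A minimal sketch (the distribution ID below is a placeholder; replace it with your own from the console):

```shell
# Invalidate /sample.png on a distribution; the ID here is a placeholder.
aws cloudfront create-invalidation \
  --distribution-id E1A2B3C4D5E6F7 \
  --paths "/sample.png"
```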

Yay! We just need to wait for our invalidation request to complete (~5 mins), and we can then access the same CloudFront URL to get the latest version of our static file.



So yay, that was a quick overview of AWS CloudFront as a caching mechanism in AWS. 🙂

Thanks for reading! ‘Til the next blog post! ❤


JS Weekly #1: Underscore, Lodash, Lazy, Apriori, and Grunt

Hi dear reader!

Hope you’re having a great June so far! 🙂 Welcome to this week’s dose of weekly JS!

For this week, we have:

  • Underscore
  • Lodash
  • Lazy
  • Apriori
  • Grunt

Day 1. Underscore.js

As a quick warmup for this series of JavaScript adventures, I took on something more familiar for Day 1: Underscore.js, a JS library we also saw in a previous blog post early this year: Underscore.js: Your Helpful Library.

Underscore.js provides a lot of functional programming helpers. It allows for easy manipulation of collections, arrays, objects, and even functions.

Screen Shot 2016-06-11 at 6.28.23 PM.png

For a quick application of Underscore.JS, we have a simple Text Analyzer that allows word frequency tracking and word highlighting with HTML, CSS, Underscore.js, and jQuery.

Screen Shot 2016-06-11 at 6.34.18 PM.png

For this application, we mostly used the uniq, map, and reduce (which is very helpful!!!) functions, as well as Underscore templates.
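To give a flavor of how the frequency tracking can be wired together, here’s a minimal sketch in plain JavaScript (the function name and sample text are mine, not the app’s); Underscore’s map/reduce can replace the native calls one-for-one:

```javascript
// A minimal plain-JavaScript sketch of the word-frequency idea behind the
// Text Analyzer; names and data here are illustrative, not from the actual app.
function wordFrequencies(text) {
  return text
    .toLowerCase()
    .split(/\s+/)                                  // break the text into words
    .filter(function(word) { return word.length > 0; })
    .reduce(function(freq, word) {
      freq[word] = (freq[word] || 0) + 1;          // accumulate a count per word
      return freq;
    }, {});
}

var freq = wordFrequencies('the quick fox and the lazy dog and the end');
// freq.the === 3, freq.and === 2, freq.fox === 1
```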

Day 2. Lodash

For Day 2, we have Lodash, a JS library very similar to Underscore (in fact, Lodash started as a fork of Underscore but has since been largely rewritten underneath).

Lodash presents a whole lot of functional programming helpers as well.

Screen Shot 2016-06-11 at 6.29.31 PM.png

To quickly try out Lodash, we have a very simple application that takes students’ names as input and groups them according to the specified input. This app uses HTML, CSS, and jQuery together with Lodash.

To make this application a little different from our Underscore app, this app focused on DOM manipulation (i.e. wrapInTD) in addition to text and data processing.

Screen Shot 2016-06-11 at 6.36.15 PM.png
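For the curious, the grouping that apps like this lean on Lodash’s _.groupBy for can be sketched in plain JavaScript like this (the data and key function below are illustrative, not from the actual app):

```javascript
// A plain-JavaScript sketch of what Lodash's _.groupBy does;
// the sample names and the first-letter key function are made up.
function groupBy(list, keyFn) {
  return list.reduce(function(groups, item) {
    var key = keyFn(item);                      // compute this item's group key
    (groups[key] = groups[key] || []).push(item);
    return groups;
  }, {});
}

var grouped = groupBy(['Ana', 'Ben', 'Alice'], function(name) {
  return name.charAt(0);                        // group names by first letter
});
// grouped.A → ['Ana', 'Alice'], grouped.B → ['Ben']
```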

Day 3. Lazy.js

Woot, 2 days down, we’re on Day 3, the game changer!

Day 3 has become a game changer for this JS series as this is the first time I used Node.js to quickly apply the JS library for the day. Starting out with Node.js, luckily, was not too difficult as npm install commands were already a little bit familiar from projects before.

Screen Shot 2016-06-11 at 6.29.49 PM.png

Lazy.js presents almost the same functionalities as Underscore but as its official site says, it’s lazier. So what does it mean to be lazier?

Recalling Underscore: if we want to take the first 5 last names that start with ‘Victoria’, we do:

var results = _.chain(people)
  .pluck('lastName')
  .filter(function(name) { return name.startsWith('Victoria'); })
  .take(5)
  .value();

But taking off from procedural code, the following seems lazier … and also faster. Why? Because we stop as soon as we have collected 5 results.

var results = [];
for (var i = 0; i < people.length; ++i) {
  var lastName = people[i].lastName;
  if (lastName.startsWith('Victoria')) {
    results.push(lastName);
    if (results.length === 5) {
      break;
    }
  }
}

And the way Lazy.js evaluates the following chain of code is along the lines of the above procedural code.

var result = Lazy(people)
  .pluck('lastName')
  .filter(function(name) { return name.startsWith('Victoria'); })
  .take(5)
  .toArray();

Inspired by a blog post I found on the web by Adam N England, I started the quick application with a section on benchmarking. Through it, I also met another npm package for the first time: bench, a JS utility that allows side-by-side comparison of functions’ performance.

Screen Shot 2016-06-11 at 6.38.05 PM.png

This application was a great learning experience as it also served as a playground for Node.js (e.g. requiring npm packages and my own files, using exports, etc.).


Moving on from benchmarking, in this app we were also able to harness some of the capabilities of Lazy.js, which include indefinite sequence generation and asynchronous iteration.

Indefinite Sequence Generation Sample:

var summation = Lazy.generate(function() {
  var sum = 0, start = 1;

  return function() {
    sum += start;
    start += 1;
    return sum;
  };
}());
// undefined

summation.take(10).toArray();
// [1,3,6,10,15,21,28,36,45,55]



Day 4. Apriori.js

Yay, Day 4! My graduate class for this semester just ended and one of our final topics was on Unsupervised Mining methodologies which included Market Basket Analysis.

For work, we also have been looking into the Apriori algorithm for Rails as we already have it for R. Wanting to investigate the Apriori algorithm more, I tried looking for a JS plugin that implements it. And luckily, I found apriori.js!

Screen Shot 2016-06-11 at 6.28.46 PM.png

Documentation was quite limited, so I read the repository’s tests and its main code to get to know the available functions.

For the quick app, we have a Market Basket Analyzer that outputs the associations found along with their respective support and confidence values. Inputs include the minimum support and minimum confidence, but both are optional.

Screen Shot 2016-06-11 at 6.40.45 PM.png
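To make the two reported measures concrete, here is how support and confidence can be computed by hand in plain JavaScript. This sketch is independent of apriori.js, and the sample baskets are made up:

```javascript
// Support and confidence for an association rule A → B, computed by hand.
// Plain-JS illustration, independent of apriori.js; the data is made up.
function supportOf(transactions, items) {
  var hits = transactions.filter(function(t) {
    return items.every(function(item) { return t.indexOf(item) !== -1; });
  });
  return hits.length / transactions.length;       // fraction containing all items
}

function confidenceOf(transactions, antecedent, consequent) {
  // confidence(A → B) = support(A ∪ B) / support(A)
  return supportOf(transactions, antecedent.concat(consequent)) /
         supportOf(transactions, antecedent);
}

var baskets = [
  ['bread', 'milk'],
  ['bread', 'butter'],
  ['bread', 'milk', 'butter'],
  ['milk']
];
var conf = confidenceOf(baskets, ['bread'], ['milk']);
// support(bread ∪ milk) = 2/4 and support(bread) = 3/4, so conf = 2/3
```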

Day 5. Grunt

Woot! And finally for Day 5, we have Grunt! Being really new to Grunt, I started Day 5 by reading the book Automating with Grunt. Grunt is similar to Rake, the Ruby tool we also use to define and run tasks.

Screen Shot 2016-06-11 at 6.30.10 PM.png

One of the quick applications I used Grunt with is a weather fetcher built on OpenWeatherMap. This is an example of a multitask: a task that can have multiple targets.

Running grunt tasks is easy, for example, to run the weather app we just need to do:

$ grunt weather
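For reference, a multitask like the one above might be registered along these lines in a Gruntfile. This is only a sketch: the ‘weather’ task name matches the command above, but the targets, city values, and log message are assumptions for illustration:

```javascript
// Gruntfile.js — a sketch of how a multitask with several targets can look.
// The targets and city values here are assumptions for illustration.
module.exports = function(grunt) {
  grunt.initConfig({
    weather: {
      manila: { city: 'Manila' },       // each key is one target of the task
      cebu:   { city: 'Cebu' }
    }
  });

  grunt.registerMultiTask('weather', 'Fetch the weather per city', function() {
    // Inside a multitask, this.target is the current target name and
    // this.data is that target's configuration object.
    grunt.log.writeln('Fetching weather for ' + this.data.city);
    // ...the actual HTTP call to the weather API would go here...
  });
};
```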


In one of the quick apps, I was also able to discover and incorporate a Grunt plugin, grunt-available-tasks, which makes viewing available tasks easier and more colorful (literally!).

So there! Yay, that’s it for Week 1 of this Days of JS project! ❤

Stay tuneeeed for more! 🙂

Thanks, reader, and have a great week ahead!!!

Quick Notes: Running http-server with npm

Hi dear reader! 🙂

How’s your May going so far? 🙂 Hope everything is as wonderful and exciting as you imagined your May to be! ❤

This blog post will be short and sweet, as it is only a mini documentation on running http-server from the Node Package Manager (npm).

http-server is a quick way to run a web server locally to serve your pages. For example, I found this useful months ago when I was playing with Angular.JS, where I needed my Angular app to run on a web server for it to work seamlessly.

I. Initializing repository with npm

To be able to install node packages locally, you can issue the following in your project’s root directory:

$ npm init

After doing this, a file named package.json will be generated. It will contain a list of the packages you have installed for your project and their corresponding versions, where applicable.

II. Installing http-server

$ npm install http-server

After installation, you will find a generated folder named node_modules, where npm has installed http-server and where your future packages will also be saved.

III. Running http-server

$ ./node_modules/.bin/http-server

or better yet, you could add this location to your PATH environment variable:

$ export PATH=./node_modules/.bin:$PATH

So you can just issue the command:

$ http-server

Take note that what we added to our PATH environment variable is a relative path so it applies even with other projects / directories.

Doing the export alone puts the node_modules directory in your PATH only temporarily. For this to persist, we can put the same line in ~/.bashrc instead:

$ export PATH=./node_modules/.bin:$PATH


So yay, there! 🙂 Thank you, reader! Wishing you a great week ahead! 🙂

Restoring MongoDB from a Dump File in AWS S3

Hi everyone!

It’s been a long time already since my last blog post! *cue music: “It’s been a long time without you, my friend!“* Haha. :))

Life has been pretty fast and busy lately wooo but fun nonetheless! I was just actually from a family vacation in Palawan and it was super nice! Clear waters, sunny skies, fresh air, yummy seafood, and crisp waves humming in one’s ear. All my favorite elements combined!

Woooo, so since today is #backtowork day, I started it with preparing a golden image for our QA database.

Backing up one of our databases wasn’t as tedious before (completing after an hour or so). But due to some major changes in data collection and recording, one of our databases became huge, which also made restoring take a while.

Due to this, preparing the testing database became one of the challenges during our last QA testing session.  I started restoring the database at 6 pm and it was still creating indices at 3 am. Because of this, I plan to just create a golden database image for QA testing regularly (maybe twice every month) and use it for QA testing sessions.

So there, sorry for the long introduction! In this blog post, we’ll walk through the steps of creating a golden image for your MongoDB database: pulling your dump from AWS S3 and setting it up on your AWS EC2 instance. 🙂

My setup includes:

  • Mongo Database
  • Database Dump in S3
  • AWS EC2 Instances.

We can divide the whole process into 5 parts:

  1. Preparing the AWS EC2 Instance
  2. Copying the Dump from S3
  3. Mounting AWS EBS storage
  4. Preparing the Copied MongoDB Dump
  5. Restoring the Copied MongoDB Dump

Before we start, let us start with the following quote:

TMUX is always a great idea!

Oftentimes, we get disconnected from our SSH connections, sometimes unfortunately with a process still running. Oftentimes too, we want to get back to whatever our workspace was. For this purpose, we can use tools like tmux or GNU screen that provide session management (along with other awesome features like screen multiplexing, etc.).
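As a quick reference, a typical tmux round trip looks like this (the session name is arbitrary):

```shell
# Start a named session before kicking off the long-running restore
tmux new -s restore

# ...run your commands inside the session...
# Detach without killing anything: press Ctrl-b, then d

# After reconnecting over SSH, reattach to the same workspace
tmux attach -t restore
```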

I. Preparing the AWS EC2 Instance

For the first part, we will be preparing the AWS EC2 instance where we will run MongoDB and restore our database.

A. Provisioning the AWS EC2 Instance

For this, I used an Ubuntu 14.04 server,

Screen Shot 2016-05-13 at 9.25.20 AM.png

and provisioned it with 72 GB of main memory and an additional 100 GB EBS volume. These sizes may be too big or too small for your setup; feel free to adjust them to whatever suits you best.

Screen Shot 2016-05-13 at 9.25.45 AM.png

B. Installing MongoDB

i. Import MongoDB public key
$ sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10
ii. Generate a file with the MongoDB repository URL
$ echo 'deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list
iii. Refresh and update packages
$ sudo apt-get update
iv. Install MongoDB
$ sudo apt-get install -y mongodb-org

C. Operating MongoDB

Here are some useful commands on operating MongoDB.

i. Starting Mongo:
$ sudo service mongod start
ii. Checking If It is Running:
$ tail -n 500 /var/log/mongodb/mongod.log

You should see something like:

[initandlisten] waiting for connections on port 27017
iii. Stopping Mongo
$ sudo service mongod stop
iv. Restarting Mongo
$ sudo service mongod restart

II. Copying the Dump from AWS S3

If your dump in S3 is publicly available, go ahead and use wget with the URL that S3 provides for your file. But in case its security settings allow it to be viewable only from certain accounts, you can use the AWS CLI to copy from S3.

i. Install AWS CLI
$ sudo apt-get install awscli
ii. Configure your Credentials
$ aws configure
iii. Execute the Copy Command

* Feel free to change the region to the region where your bucket is

$ aws s3 cp s3://bucket-name/path/to/file/filename /desired/destination/path --region us-west-2


III. Mounting AWS EBS Storage

In Part I, we provisioned our EC2 instance with 100 GB of EBS storage; now it’s time to mount it in our EC2 instance to make it usable.

We first want to see a summary of available and used disk space in our file system:

$ df -h

We can see that our 100 GB is still not part of this summary. Listing all block devices with:

$ lsblk

We get:

Screen Shot 2016-05-13 at 9.29.11 AM.png

Since this is a new EBS volume, it has no file system on it yet, so we proceed with creating a file system and mounting the volume:

i. Check and Create File System
$ sudo file -s /dev/xvdb
$ sudo mkfs -t ext4 /dev/xvdb
ii. Create, Mount, Prepare Directory
$ sudo mkdir /data
$ sudo mount /dev/xvdb /data
$ cd /data
$ sudo chmod 777 .
$ sudo chown ubuntu:ubuntu -R .

For an in-depth tutorial on attaching EBS volumes, you may check another blog post of mine: Amazon EBS: Detachable Persistent Data Storage.

IV. Preparing the Copied MongoDB Dump

Once you have downloaded your dump from S3, it is most likely compressed to save space. In that case, you need to uncompress it.

If your dump file has a .tar extension, you can untar it by:

$ tar -xvf /path/to/dump/dump-filename.tar

On the other hand, if your dump file has a .tar.gz extension, you can untar-gz it by:

$ tar xvzf /path/to/dump/dump-filename.tar.gz -C desired/destination/path/name

Continue un-tarring and unzipping your files if the main dump file contains nested compressed resources.

V. Restoring the Copied MongoDB Dump

$ export LC_ALL="en_US.UTF-8"
$ mongorestore --drop --host localhost --db db_name_here path/to/the/copied/dump/filename

If you are in tmux, in case you get disconnected, you can get back to your previous workspace by:

$ tmux attach


So there, a really quick and short tutorial on how we can get our Mongo Dumps and Databases up and running. 🙂

PostgreSQL 101: Getting Started! (Part 1)


An object-relational database system

I. Installation

A. Mac OSX:

brew install postgresql

B. Ubuntu

sudo apt-get update
sudo apt-get install postgresql postgresql-contrib

II. Console Commands

A. Connecting to PostgreSQL Server

To connect to the PostgreSQL server as user postgres:

psql -U postgres

By default, psql connects to a PostgreSQL server running on localhost at port 5432. To connect to a different port and/or host, add the -p and -h flags:

psql -U postgres -p 12345 -h < host >

Once in, you may navigate via the following commands:

  • \l – list databases
  • \c – change databases
  • \d – list tables
  • \df – list functions
  • \df+ – list functions with definitions
  • \q – quit

III. Database Creation

CREATE DATABASE < database name >;

# Creates database with name: test_db
CREATE DATABASE test_db;

IV. Database Drop

DROP DATABASE < database name >;

# Drops database with name: test_db
DROP DATABASE test_db;

V. Table Creation

CREATE TABLE programs(
  programid SERIAL PRIMARY KEY,
  degree VARCHAR(10),
  program VARCHAR(255)
);

CREATE TABLE students(
  studentid SERIAL PRIMARY KEY,
  student_number VARCHAR(15),
  first_name VARCHAR(100),
  last_name VARCHAR(100),
  programid INTEGER REFERENCES programs
);

A. Column Data Types

Commonly used types include INTEGER, SERIAL (auto-incrementing integer), VARCHAR(n), TEXT, BOOLEAN, DATE, and TIMESTAMP.

B. Common Added Options

Columns can also carry options such as PRIMARY KEY, NOT NULL, UNIQUE, DEFAULT, and REFERENCES.

VI. CRUD Operations

A. Insertion of Rows


INSERT INTO table_name(column1, column2, column3...)
VALUES(value1, value2, value3...);


INSERT INTO programs(degree, program)
VALUES('BS', 'Computer Science');

INSERT INTO programs(degree, program)
VALUES('BS', 'Business Administration and Accountancy');

INSERT INTO students(student_number, first_name, last_name, programid)
VALUES('2010-00031', 'Juan', 'Cruz', 1);

INSERT INTO students(student_number, first_name, last_name, programid)
VALUES('2010-00032', 'Pedro', 'Santos', 2);

B. Read/Lookup of Row

i. Get All Rows

SELECT * FROM students;

ii. Get Rows Satisfying Certain Conditions

# Gets row/s with studentid = 1

SELECT * FROM students where studentid = 1;

# Gets row/s where the last_name starts with 'cru' (case-insensitive)

SELECT * FROM students where last_name ilike 'cru%';

# Gets row/s where the student_number column is either '2010-00033', '2010-30011', or '2010-18415'

SELECT * FROM students where student_number in ('2010-00033', '2010-30011', '2010-18415');

iii. Get Specific Columns from Resulting Rows

# Selects the last_name and first_name from the students table

SELECT last_name, first_name from students;

# Selects the program column from rows of the programs table satisfying the condition and then prepending the given string

SELECT 'BUSINESS PROGRAM: ' || program from programs where program ilike '%business%';

C. Update of Row

i. Update all Rows

UPDATE students SET last_name = 'Cruz';

ii. Update Rows Satisfying Conditions

UPDATE students SET last_name = 'Santos' where studentid = 1;

UPDATE programs SET degree = 'BA' where programid NOT IN (2);

D. Deletion of Row

i. Delete all Rows

DELETE FROM students;

ii. Delete Rows Satisfying Conditions

DELETE FROM students WHERE studentid NOT IN (1,2);

VII. Queries

A. Joins

i. Inner Join

SELECT * FROM table_1 JOIN table_2 using (common_column_name);
SELECT student_number, program FROM students JOIN programs using (programid);

ii. Left Join

SELECT * FROM table_1 LEFT JOIN table_2 on table_1.column_name = table_2.column_name;

We insert a student row without a program

INSERT INTO students(student_number, first_name, last_name)
VALUES('2010-35007', 'Juana', 'Change');

Doing a left join would still return the recently inserted row but with empty Programs-related fields.

SELECT * FROM students LEFT join programs on students.programid = programs.programid;

iii. Right Join

SELECT * FROM table_1 RIGHT JOIN table_2 on table_1.column_name = table_2.column_name;

We insert a program row without any students attached

INSERT INTO programs(degree, program)
VALUES('BS', 'Information Technology');

Doing a right join would still return the recently inserted row but with empty Students-related fields.

SELECT * FROM students RIGHT join programs on students.programid = programs.programid;


B. Where

Specifies conditions by which rows from the query will be filtered.

SELECT * from students where programid IS NOT NULL;

C. Group By

Allows the use of aggregate functions, with the attributes provided to the GROUP BY clause serving as the basis for aggregation.

SELECT program, COUNT(*) FROM students
JOIN programs USING (programid) GROUP BY program;

The above example counts students per program.

D. Having

Similar to WHERE, but applies the condition to the groups produced by GROUP BY.

SELECT program, COUNT(*) FROM students
JOIN programs USING (programid) GROUP BY program HAVING COUNT(*) > 1;

E. Union

Combines the result sets of multiple queries.

select * from students where programid in (1, 2)
UNION
select * from students;

MongoDB 101: A Starter Guide


Open source document database

I. Definitions

A. Document

  • Represents one record in MongoDB; consists of key-value pairs.

  • Similar to JSON objects

  • Values may include other documents, arrays, or arrays of documents

{
    "_id" : ObjectId("54c955492b7c8eb21818bd09"),
    "student_number" : "2010-30010",
    "last_name" : "Dela Cruz",
    "first_name" : "Juan",
    "middle_name" : "Masipag",
    "address" : {
        "street" : "#35 Maharlika St.",
        "zipcode" : "30011",
        "city" : "Quezon City",
        "coord" : [ -73.9557413, 40.7720266 ]
    },
    "gwa" : "1.75",
    "program" : "Computer Science",
    "degree" : "Bachelor of Science"
}

B. Collection

Documents are stored in collections. They are similar to tables but unlike a table, a collection does not require documents to have the same schema.

Documents stored in a collection have a unique identifier _id that acts as the primary key.

II. Installation

A. Mac OSx

brew install mongodb

B. Ubuntu

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv EA312927

echo "deb http://repo.mongodb.org/apt/ubuntu trusty/mongodb-org/3.2 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-3.2.list

sudo apt-get update

sudo apt-get install -y mongodb-org

III. Setup

A. Running the Database

By default, mongod looks for your database at /data/db:

mongod

In case your database is at a different path, provide the --dbpath parameter:

mongod --dbpath .

B. Running the Console

mongo

By default, the Mongo console connects to localhost at port 27017. Make sure that mongod is running before you issue the mongo command.

C. Switching database

use mongo-cheatsheet
# switched to db mongo-cheatsheet; would be created if non existent

IV. CRUD Operations

A. Insertion

db.students.insert({
     "student_number" : "2010-30010",
     "last_name" : "Dela Cruz",
     "first_name" : "Juan",
     "middle_name" : "Masipag",
     "address" : {
       "street" : "#35 Maharlika St.",
       "zipcode" : "30011",
       "city" : "Quezon City",
       "coord" : [ -73.9557413, 40.7720266 ]
     },
     "gwa" : 1.75,
     "course" : "BS Computer Science"
})
 # => WriteResult({ "nInserted" : 1 })
 # _id is automatically assigned

B. Read or Lookup

# Find student with student_number 2010-30010
db.students.find( { "student_number": "2010-30010" } )

# Find student with zipcode (embedded attribute) 30011
db.students.find( { "address.zipcode": "30011" } )

# Find students with the gwa column greater than 1.25
db.students.find( { "gwa": { $gt: 1.25 } } )

# Find students with the gwa column less than 1.25 and course is BS Computer Science
db.students.find( { "gwa": { $lt: 1.25 } , "course": "BS Computer Science"} )

# Find students with the gwa column less than 1.25 or course is Computer Science
db.students.find( { $or: [{ "gwa": { $lt: 1.25 } } , {"course": "BS Computer Science"}]})

C. Update

i. Update Attribute/s

Updates the first matching document with first_name: Juan

db.students.update(
    { "first_name" : "Juan" },
    {
        $set: { "first_name": "Juana" }
    }
)

The following code snippet updates the first matching document with first name Juan, and also sets the field lastModified to the current date (since it is non-existent on the first run given our schema, it will be created and set).

db.students.update(
    { "first_name" : "Juan" },
    {
        $set: { "first_name": "Juana" },
        $currentDate: { "lastModified": true }
    }
)

ii. Update Embedded Fields

db.students.update(
    { "first_name" : "Juana" },
    { $set: { "address.street": "#45 Maginhawa St.",
              "address.city": "Quezon City" }}
)

iii. Updating All Matching Documents

By default, update only updates the first matching document. To tell MongoDB to update all matching, we pass multi: true

db.students.update(
  { "first_name" : "Juan" },
  { $set: { "address.street": "East 31st Street" } },
  { multi: true }
)
# WriteResult({ "nMatched" : 2, "nUpserted" : 0, "nModified" : 2 })

iv. Replace

db.students.update(
    { "first_name" : "Juan" },
    {
        "first_name" : "Victoria",
        "address" : {
            "street" : "Emerson Subdivision",
            "city" : "Saog, Marilao" }
    }
)

If you want to insert the document in case it is non-existent, also pass upsert: true as an option to the update call:

{ upsert: true }

D. Delete

i. Delete a document

db.students.remove( { "first_name": "Juan" } )
# removes all matching documents

db.students.remove( { "first_name": "Victoria" }, { justOne: true } )
# removes only one of the matching documents

ii. Drop a Collection

db.students.drop()
# => true

V. Query

In addition to simple lookup commands, you can also use aggregation:

# counts students per course
db.students.aggregate([
  { $group: { "_id": "$course", "count": { $sum: 1 } } }
])

Other available operators:

  • $sort
  • $project
  • and many more…

VI. Data Import

A. Import from JSON, CSV, TSV

To import dataset from a JSON, CSV, or TSV

mongoimport --db mongo-cheatsheet --collection students --drop --file primer-dataset.json


  • Database name: mongo-cheatsheet
  • Collection name: students
  • Source File: primer-dataset.json

By default, mongoimport connects to localhost:27017. If you wish to connect to another host or port, add the --host and --port flags:

mongoimport --db mongo-cheatsheet --collection students --drop --file primer-dataset.json --host <hostname> --port 27019

B. Restore from Mongo Backup

To restore from a mongoDB backup:

mongorestore --drop --db mongo-cheatsheet /path/to/your/dump


  • Database name: mongo-cheatsheet
  • Dump path: /path/to/your/dump

C. Backup a Mongo Database

mongodump --db mongo-cheatsheet

  • Database name: mongo-cheatsheet

Underscore.JS: Your Helpful Library

Hey reader! First of all a happy happy new year! 🎉 Wishing you happiness, prosperity, good health, and success in the year ahead! \:D/

For this blog post, we’ll be looking into Underscore.JS. As described on the Underscore.JS page, Underscore is

“a JavaScript library that provides a whole mess of useful functional programming helpers without extending any built-in objects.”

In a bird’s eye view, functional programming is the use of functions to transform values to other structures or into units of abstraction (and indeed Underscore provides a battalion of functions that you can use as is or use inside other functions).

Most blog posts say that if you miss the usual functional methods (available in most functional programming languages like Ruby) in your Javascript code then, Underscore is for you!

Screen Shot 2016-01-07 at 5.06.40 PM.png

Underscore.JS Website

Looking through the available functions and sample code, indeed, it’s true! Underscore provides a wide array of functions including pluck, map, and reduce which we see (and use) most of the time with our server code and other functional programming languages. With these, it is much easier to manipulate and transform data even on the client side.

To see a quick sample:

Let’s say we have this array:

var ratings = [80, 90, 99, 98, 100, 60, 50, 70];

And we want to add 5 to each of the array elements. With pure JavaScript, we do this:

for (var i = 0; i < ratings.length; i++) {
   ratings[i] = ratings[i] + 5;
}

But with Underscore on, we can do a simple one-liner:

ratings = _.map(ratings, function(rating) { return rating + 5; });

Tadaaaa! Sweet and simple isn’t it? So now let’s get it going with Underscore.js.

Including Underscore.JS in our File

To utilize Underscore, we need to first load it in our page. There are many ways to achieve this depending on your needs: bower, npm (Node Package Manager), Require.JS, among others. For this blog post, I downloaded underscore.js and included it in our page:

<script src="underscore.js"></script>

We can check that it is indeed loaded by typing “_” in our console:

Screen Shot 2016-01-07 at 7.05.29 PM.png

Now that Underscore is loaded, we can start coding, either by typing in the browser console or by adding JavaScript code to our HTML file.


Getting Started with Array, Collections, and Objects

As you may have observed, all of Underscore’s functions can be invoked on “_” (but we could also change this as we’ll see later on :D)

On the left side of Underscore’s page, you can see the multitude of functions available in the different categories of Collections, Array, Functions, Objects, Utility, and Chaining.

In this blog post, we’ll see a few examples from those categories and we’ll see how we can combine them to form more complex queries.

For the examples below, this is the dataset that we will be working on:
Screen Shot 2016-01-07 at 7.22.37 PM.png


We can use _.each to iterate over an objects (that can be enumerated)

For example, if we want to print each item and the category it belongs to:

_.each(grocery, function(item) {
  console.log( + " belongs to " + item.category);



Similar to Rails’ pluck, we can use _.pluck when we want to extract the list of values for a given key (think of it as a shortcut to map when you just need the values).

For example, if we want to get all the names of the items in our grocery list, we could do:

_.pluck(grocery, 'name')


Returns the elements in the list that return true for the given condition.

Say we want to get all grocery items with price less than 50, we can have:

_.filter(grocery, function(item) { return item.price < 50; })

* _.filter is now the preferred name for as mentioned in the official documentation, but using still works so far (you might see often in other Underscore.JS blogs and tutorials). 🙂
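The same selection works with the native Array.prototype.filter; a sketch on a small made-up list:

```javascript
// Native equivalent of _.filter: keep the items whose price is below 50.
var grocery = [
  { name: 'Apple', price: 20 },
  { name: 'Milk', price: 60 },
  { name: 'Bread', price: 35 }
];

var cheap = grocery.filter(function (item) { return item.price < 50; });
console.log(cheap.length); // 2
```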




In case some properties are undefined, _.defaults fills them in with the default values that you provide.

grocery =, function(item) {
  return _.defaults(item, {name: 'Unnamed', category: 'Uncategorized', subcategory: 'Unsubcategorized'});


On our remote controlled car object (the last item in the list), we can see that after we ran _.defaults, its category and subcategory were filled in and set to Uncategorized and Unsubcategorized respectively.
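For intuition, here is a rough native sketch of what _.defaults does: copy a default over only when the property is missing, keeping any existing values (unlike a blind Object.assign, which would overwrite):

```javascript
// Minimal defaults helper: fill in only the missing (undefined) keys.
function defaults(obj, defs) {
  for (var key in defs) {
    if (obj[key] === undefined) {
      obj[key] = defs[key];
    }
  }
  return obj;
}

var car = { name: 'Remote Controlled Car', price: 150 };
defaults(car, { category: 'Uncategorized', subcategory: 'Unsubcategorized' });
console.log(car.category); // 'Uncategorized'
console.log(car.price);    // 150 — existing value untouched
```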




In an object’s sea of attributes, when you only want certain ones, you can use _.pick.

Let’s say we want our array grocery to have objects that only include name and category, we can do:

console.log(, function(item) {
  return _.pick(item, 'name', 'category');




The reduce function allows us to compress a list into a single value.

For example, say we want to add up the prices of all our items. In our callback, we pass acc, our accumulator, and item, the variable to which each element of the list is assigned in turn.

_.reduce(grocery, function(acc, item) {
  return acc + item.price;
}, 0);
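The native Array.prototype.reduce has the same shape (callback plus initial accumulator); a sketch on a made-up list:

```javascript
// Native equivalent of the _.reduce example: sum up all the prices.
var grocery = [
  { name: 'Apple', price: 20 },
  { name: 'Milk', price: 60 },
  { name: 'Bread', price: 35 }
];

var total = grocery.reduce(function (acc, item) {
  return acc + item.price;
}, 0);
console.log(total); // 115
```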




_.groupBy groups a list into categories and returns an object with the categories as keys and, as values, arrays containing the items that fall under each category.

_.groupBy(grocery, function(item) { return item.category })
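Under assumed sample data, a minimal native sketch of what _.groupBy computes looks like this:

```javascript
// Minimal sketch of _.groupBy: build an object keyed by whatever the
// callback returns, each key holding an array of the matching items.
function groupBy(list, fn) {
  var groups = {};
  list.forEach(function (item) {
    var key = fn(item);
    (groups[key] = groups[key] || []).push(item);
  });
  return groups;
}

var grocery = [
  { name: 'Apple', category: 'Food' },
  { name: 'Milk', category: 'Food' },
  { name: 'Shampoo', category: 'Hygiene' }
];

var byCategory = groupBy(grocery, function (item) { return item.category; });
console.log(byCategory.Food.length); // 2
```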




_.countBy, on the other hand, groups a list into categories that you specify and returns how many items belong to each category.

Here, we counted how many items fall into the different price ranges: 1-30, 31-60, 61-100, and above 100.

_.countBy(grocery, function(item) {
   if (item.price < 30) return '1-30';
   if (item.price < 60) return '31-60';
   if (item.price < 100) return '61-100';
   return '> 100';
});




_.range is a handy function that gives the values within the given range (start inclusive, end exclusive), stepping by the given third parameter.

_.map(_.range(1, 13, 1), function(day) { return 'Day ' + day + ' of Christmas!'; });


Doing _.range(0, 10, 2) gives 0, 2, 4, 6, 8, while a negative increment like _.range(99, 0, -1) gives the numbers from 99 down to 1.
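A minimal native sketch of the same behavior (start inclusive, end exclusive, positive or negative step):

```javascript
// Minimal sketch of _.range: walk from start toward stop by step,
// stopping before stop is reached (end-exclusive).
function range(start, stop, step) {
  var result = [];
  if (step > 0) {
    for (var i = start; i < stop; i += step) result.push(i);
  } else {
    for (var j = start; j > stop; j += step) result.push(j);
  }
  return result;
}

console.log(range(0, 10, 2));   // [0, 2, 4, 6, 8]
console.log(range(99, 96, -1)); // [99, 98, 97]
```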


Continuing with Function and other Utility Functions



Another handy function from Underscore is _.once; it allows the wrapped function to be run only once. Succeeding calls to the wrapped function just return the value from the first call.

var generateUniqueId = function() {
  // Assigns only one unique raffle number per user.
  var points = _.random(10);
  return points;
};
a = _.once(generateUniqueId);



In the above example, only one raffle number can be assigned to a user; succeeding calls to our simple raffle number generator just return the already assigned number.

The original unwrapped function, generateUniqueId(), on the other hand, behaves normally and can give different values on different calls.
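A minimal native sketch of how _.once can be implemented: a closure remembers whether the function has run and caches its first result.

```javascript
// Minimal sketch of _.once: run the wrapped function on the first call,
// then keep returning that first result on every later call.
function once(fn) {
  var called = false;
  var result;
  return function () {
    if (!called) {
      called = true;
      result = fn.apply(this, arguments);
    }
    return result;
  };
}

var calls = 0;
var init = once(function () { calls += 1; return calls; });
console.log(init()); // 1
console.log(init()); // 1 — the wrapped function body did not run again
```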



_.times invokes the given function the given number of times.

_.times(5, function(i) { console.log('hi ' + i)});



Method Chaining

Above, we saw a lot of handy functions that should get you up and running (but of course there are many more, so do check out the official documentation :D).

As we mentioned above, functional programming involves the use of functions to accomplish a certain task, either to transform objects or to simplify and reduce an object or list to a single value. And in accomplishing a task, we may use not only one, but a series of functions to get our desired result. For this purpose, we can do method chaining.

Calling _.chain returns wrapped objects, so we don’t get the final value until we call .value() on our chain.

console.log(_.chain(grocery).map(function(item) {
  return item.category + ' Product';
}).value());
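Native array methods chain in much the same way, since each call returns a new array; a sketch on a made-up list, with no wrapper or .value() step needed:

```javascript
// Native chaining: map and filter each return an array, so the calls
// compose directly.
var grocery = [
  { name: 'Apple', category: 'Food' },
  { name: 'Shampoo', category: 'Hygiene' }
];

var labels = grocery
  .map(function (item) { return item.category + ' Product'; })
  .filter(function (label) { return label.indexOf('Food') === 0; });
console.log(labels); // ['Food Product']
```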


Underscore.JS as a Templating Engine

Say we have a dynamic page that should say “Hello <name of user here>”. We need our HTML code to evaluate a dynamic value that is passed to it. For this purpose, Underscore’s _.template comes in handy.

So how do we do it?

First, let us declare our template and enclose the dynamic values with the default interpolation delimiter of Underscore <%= %>.

var template = 'Hello <%= name %>! Welcome to this site!';

And then let’s compile it with _.template,

var finalTemplate = _.template(template);

which returns a function that we can call:

console.log(finalTemplate({name: 'John Doe'}));



We now have our HTML with the name variable replaced by the given parameter. To use this, we can just assign the result to our element’s HTML (for ease of use, I also used jQuery for DOM manipulation):

result = finalTemplate({name: 'John Doe'});
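For intuition, here is a rough sketch of just the interpolation step that _.template performs, done with a regular expression over the <%= %> delimiters (simple property lookup only, none of Underscore’s evaluation features):

```javascript
// Minimal <%= key %> interpolation: replace each delimiter with the
// matching property from the data object.
function render(template, data) {
  return template.replace(/<%=\s*(\w+)\s*%>/g, function (match, key) {
    return data[key];
  });
}

var greeting = render('Hello <%= name %>! Welcome to this site!', { name: 'John Doe' });
console.log(greeting); // 'Hello John Doe! Welcome to this site!'
```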


Other Configurations

As mentioned above, there are several customizations that you can do with Underscore and here are some of them:

I. Changing _

We can change _ to another character or set of characters that you like (maybe you have another use for the underscore character, such as storing the last evaluated value), by doing the following:

var uscore = _.noConflict();

So now, we can use uscore instead of _:

events = ['New Year', 'Graduation', 'Birthday', 'Anniversary'];
console.log(uscore.each(events, function(event) { console.log('Happy ' + event  + '!!!')}));


But bear in mind that when you do this, _ is no longer usable unless you re-assign it.


II. Object Orientedness

scores = {'Bob': 1, 'Alice': 4, 'Mallory': 3};
_(scores).keys().map(function(p) { return "Hi " + p + "!"; });

When using Underscore.JS in this object-oriented style, you can immediately chain functions without using _.chain.


III. Customized Template Delimiters

In case we want to change the delimiters from <% %> to something else of our liking, we can do so by providing custom interpolation and evaluation regular expressions to _.templateSettings.

In this example, we changed <%= %> to {{ }}

_.templateSettings = {
  interpolate: /\{\{(.+?)\}\}/g,
  evaluate: /\<\%(.+?)\%\>/g
};


Sample application

Now, let’s try to build a sample application that uses Underscore.JS both for data processing and as a templating engine.

Say we have a product inventory, where our list of grocery items is displayed and the user can add more items.


The full source code can be found in this gist.

In implementing this, we used the following:


  1. Use of Defaults

    We allow empty categories and subcategories in our form, so we need to fill them in with defaults:

    grocery =, function(item) {
       return _.defaults(item, {name: 'Unnamed', category: 'Uncategorized', subcategory: 'Unsubcategorized'});
    });

  2. Use of Templates

    We also used templates to display our table dynamically. We call the updateView() function when the page loads and every time an item is added to our grocery list.

  3. Customized Template Delimiters

    We changed the interpolation delimiters from <%= %> to {{ }} and retained the evaluation delimiters as <% %>.


  4. Some Functional Programming

    We standardized our error and warning messages by creating the fail and warn methods which we can call anywhere accordingly.


    We also created a parser for our input, which is called before we push the new item to our grocery list. This parser calls our fail and warn functions.




As we saw, Underscore.JS is very helpful and provides tons of functions that we can use to manipulate our data on the client side. At only a few kilobytes, it can already do wonders! Try it now! 🙂