What does it mean to strive for ENGINEERING EXCELLENCE? As your engineering organization grows, you want to encourage teams and individuals to develop and improve their technical proficiency and practices. Growth puts stress on your teams’ ability to deliver kick-ass code consistently. To help them improve and grow, we built the LEVEL UP model to represent everything that’s expected of an excellent engineering team.
In this talk, I’ll review why we built this model, how we refined it, how we got teams to adopt it, and the effect it had on our department. I’ll also give you the tools to either use our model as-is or fork a version specific to your own organization.
So you’ve launched your first service and all your customers in the US love it - great! Now comes the hard part: how do you make the entire world love it too?
In this talk, I’m going to share lessons from my experience building the PayPal Mobile app and expanding WeWork’s systems to span the globe. I’m going to cover it all: encodings, translations, UGC, CATs, i18n, g11n, l10n, LQA, money, dates, names, and taxes - so that you can expand your app and reach new customers in London, Beijing, Tel-Aviv, and Paris.
In this post I’ll show you a small trick to help declutter the viewDidLoad methods in your ViewControllers and move UI setup code to where it belongs.
So let’s start by defining the issue - often, when initializing a view controller from a storyboard, you have to add some custom code to set up something that is either hard or impossible to do in Interface Builder.
Let’s say that we have a page with a simple TableView and we want to initialize that TableView with a content inset so that it never overlaps the icon at the bottom.
Our ViewController in Xcode’s Interface Builder
Normally we would have done that extra setup in our viewDidLoad method in the ViewController.
class MyViewController: UIViewController {
  @IBOutlet weak var tableView: UITableView!

  override func viewDidLoad() {
    super.viewDidLoad()
    tableView.contentInset = UIEdgeInsets(top: 0, left: 0, bottom: 100, right: 0)
  }
}
This is a great way to do things like that - but when you have a complicated ViewController, your viewDidLoad method can get really messy and clogged up with lots of UI setup code.
In Swift we have an awesome feature called Property Observers that lets us attach an observer to a property of our class; this observer gets called whenever the property is modified. Specifically, we use didSet.
We can utilize this feature to run our setup code when the view object is assigned to its outlet property (which happens after the ViewController decodes the storyboard).
This way, all our UI setup code lives in its contextual place and our viewDidLoad is free to manage the logic and not the UI :)
class MyViewController: UIViewController {
  @IBOutlet weak var tableView: UITableView! {
    didSet {
      tableView.contentInset = UIEdgeInsets(top: 0, left: 0, bottom: 100, right: 0)
    }
  }
}
I’ve been really enjoying playing around with Swift for the last couple of months, and I’ll be sharing more tips as I find them :)
So be sure to follow me on twitter @yonbergman.
First I wanted to share a few of the reasons why I enjoy using Parse:
An actual walk on the beach
One of the most basic things for a developer’s sanity is not having all the code in one huge file - a guideline much easier said than followed in Parse. While the Parse documentation is incredibly good, it lacks any direction on how to organize your cloud code nicely.
The solution is very simple: Parse offers a way to manage dependencies in a fashion similar to CommonJS, with something they call Modules.
For anyone who hasn’t worked with CommonJS or the like, it is a simple JS module loader that lets you manage dependencies between your JS files.
Let’s take for example this simple app with a player model and one cloud function:
- cloud
|- models
| \- player.js
\- main.js
// cloud/models/player.js
var Player = Parse.Object.extend("Player", {
  // instance methods
}, {
  // class methods
  find: function(id) {
    var q = new Parse.Query(Player);
    return q.get(id);
  }
});

module.exports = Player;
// cloud/main.js
var Player = require('cloud/models/player.js');

Parse.Cloud.define("gameOver", function(request, response){
  Player.find(request.params.id).then(function(player){
    // ... Do something with player
  });
});
The two important lines to understand are the last line of player.js, where we expose our newly created Player class to whoever depends on us, and the first line of main.js, where we declare our dependency on the player file and receive the exposed class so that we can use it.
That’s all there is to setting up file dependencies in Parse cloud code.
Parse cloud code doesn’t support CoffeeScript out of the box, which is a shame because it makes writing JS so much more fun. To get around this, I came up with a nice little solution that also lets me re-use code between the Cloud and the Client.
My solution relies on Middleman - a great library that I’ve been using more and more lately. Middleman lets you generate static sites with ease (it’s written in Ruby and very similar to Jekyll).
To start combining Middleman and Parse, it’s important to understand what Parse expects to get and run. Parse expects two main parts for a site: first, a cloud/main.js file as the entry point to the cloud code, and second, a public folder containing the static HTML, CSS, and JS of the site.
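Put together, a bare Parse project looks roughly like this (the files under public are just placeholders - any static HTML, CSS, and JS will do):
- cloud
| \- main.js
\- public
  |- index.html
  |- stylesheets
  \- javascripts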
If we were to create an empty Middleman project, we would be almost halfway to getting a static site running on Parse :)
Middleman’s build directory is called build by default, and we can either reconfigure Middleman to output the compiled static files to public, or create a symbolic link from public to build (I prefer the symbolic link).
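If you’d rather go the reconfiguration route, a one-line change in Middleman’s config.rb should do it (a minimal sketch - I went with the symlink instead):
# config.rb
# compile straight into the folder Parse serves static content from
set :build_dir, 'public'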
To create the link, all you need to do is run:
ln -s build public
and you can even commit this link into your git repo without filling git with junk.
You now have a static site generated using Middleman running on Parse (of course you also need to deploy the changes to Parse).
The next part is getting the cloud code running as well. We want to write it in CoffeeScript and also potentially share it with the client-side JS rendered by Middleman and served via the public directory.
My solution is to have a file under the source/javascripts directory that will compile into the final cloud/main.js:
- source/javascripts
  |- cloud
  |  \- gameOver.js.coffee
  |
  |- models
  |  \- player.js.coffee
  |
  |- all.js.coffee
  \- cloud.js.coffee
# source/javascripts/cloud.js.coffee
#= require_self
#= require_tree ./models
#= require ./cloud/gameOver.js.coffee

MyApp = {}
This file uses Middleman’s asset pipeline (Sprockets) to require any files that, at build time, will compile into the final JavaScript. Note that by structuring the source folder correctly we can share files between the JS going to the client side (all.js) and the JS running on the Parse cloud (cloud.js).
The last piece of the puzzle is to create another symbolic link, this time linking cloud/main.js to build/javascripts/cloud.js - again, this is done by running:
ln -s build/javascripts/cloud.js cloud/main.js
Now that we have the Middleman app ready, code shared between client and server, and the cloud code in place, deploying a new version is a simple two-step process:
> middleman build   # compile the latest version of the site and code
> parse deploy      # deploy the newly compiled code to Parse's servers
We just need to make sure the Parse app is configured to serve static content and we’re good to go :)
A site developed in Middleman & Parse
I really hope these two pointers help people build cool things - let me know if this helped you or if you have any follow-up questions.
You can always find me on twitter @yonbergman.
People mesmerized by ‘Too Many Cooks’
For those of you who haven’t heard about it - the Global Day of Coderetreat is a day where 150+ cities around the world, from Australia to Hawaii, all host a coderetreat.
Coderetreats are events where developers come together to hone their craft and work on bettering themselves and their coding community.
In a coderetreat we usually have 6 sessions throughout the day where you solve Conway’s Game of Life, each time paired with a different partner and working under different constraints.
This year my co-host @avivby and I wanted to try something new. For the past 3 years we’ve had a routine that more or less works, with a set order and timing for the different sessions.
In past years we had a small issue where people get tired after lunch - to combat that, we moved our best session (the evil mute programmer) to just after lunch. This helped greatly but created a new issue - now people were losing focus after this 4th session.
Before this year’s event we were trying to come up with a new and exciting session to add during the day. I remembered that at the last GoGaRuCo I was exposed to a different code kata called The Gilded Rose, and I really liked the idea of a kata where you work on existing code that simulates the pain of working on a real legacy system.
We decided we wanted a session like that in this year’s coderetreat. We discussed where the starting code would come from, and thought - what better source than the evil mute programmer session?
So in this year’s coderetreat, at the end of the 4th session (evil mute programmers), instead of deleting the code we asked everyone to zip the code and tests and upload them to a mini-site we set up for them.
After the retrospective for the 4th session we kicked off the 5th session by welcoming everyone to their new job at Initech and assuring them that the legacy system they were going to work on for the next session was built by our best people :)
In this 5th session each pair received a random zip file containing the code from one of the other pairs in the previous session, and they had to complete it while adapting it to a new constraint (supporting an infinitely sized world). This session was a blast, almost as fun as the evil mute programmer one - people got to work with code that had tests (even though they didn’t always pass) and had to adapt someone else’s code. It really helped keep the excitement from the 4th session alive and kept people energetic all the way to the end of the coderetreat.
The mini-site is very simple - it’s a single-page application written with Backbone, CoffeeScript, SASS, HAML & Parse.com.
It isn’t the nicest piece of code but I built it in a couple of hours and it works.
I chose to use Parse as a database/backend mostly because I wanted to try it out, and it was a delightful experience. It was super easy to get going with Parse as a simple data store and even easier to integrate it with Backbone.
It had been a long time since I’d done Backbone without Marionette - and I really can’t see how one could do that in a real project. Marionette just removes so much of the cruft that comes with working with Backbone.
I built the mini-site as part of the already existing site I had for the Israeli coderetreat events so it was based on a very simple Sinatra server that compiles the HAML/SASS/Coffee - but you can probably do without it and deploy to S3 or anyplace that accepts static sites.
I extracted the code so you can easily set up your own server running on Heroku - just clone the repo and follow the instructions.
If you haven’t participated in a coderetreat yet, look for the next one being held near you or organize one yourself. It’s an exciting event that brings energy and passion back into coding.
You can always find me on twitter @yonbergman.
Last week we had the 3rd Israeli Rails conference in Tel-Aviv, and I had fun meeting all the guests who came to give talks and sharing a drink with them.
I also gave a more technical talk about splitting a Rails app into several apps and gems, with a case study of an app I got to work on.
Coming soon
I want to thank the organizers @wifelette, @joshsusser, @purp & @sarahmei for putting on an awesome conference.
I was sad to hear that this was the last GoGaRuCo - but was honored not only to attend for the third time but also to get to speak at the final one.
A big shout out goes out to all the other attendees of the conference and especially the other speakers - you were all awesome and super nice.
In my talk I shared the story of how I designed and built my first board game, and how Ruby helped me hone and optimize the game mechanics and parameters.
I started with a short intro to the current state of board gaming and what goes into making a game, and continued with my experiences building Missiles & Microchips.
+10 awesome points to @jessabean who not only gave an awesome talk on Sketch Noting but sketch noted several talks including mine ;)
When building a JSON API in Rails, you quickly outgrow respond_to and .to_json.
You have many options, whether it’s JBuilder, ActiveModel::Serializer, Grape, or Rabl.
I found that ActiveModel::Serializer works in a way that best fits me, both from functionality and style perspectives.
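As a quick illustration of the style (a minimal sketch - the serializer below is hypothetical, not taken from a real app), declaring what gets rendered looks something like this:
# a hypothetical serializer using ActiveModel::Serializer
class TicketSerializer < ActiveModel::Serializer
  # whitelisted attributes that end up in the JSON output
  attributes :id, :price_cents, :number_of_seats

  # associations can be embedded as well
  has_one :show
end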
The issue with these libraries is that they only handle creating JSON representations of your data objects to pass out - but what about the other way around? When writing an API, you will encounter the need to parse data sent to you as JSON into something that fits your object model.
I found that using Hashie is the best tool for this job. It fits the style I was looking for and mirrors using ActiveModel::Serializer on the serializing end.
Hashie is awesome because it comes with a lot of things out of the gate that help you parse nested data objects.
Hashie has verification and transformation capabilities which are a must when working with data passed into your system.
Hashie also provides a very clean output that works both as a Hash with indifferent access (object[:property]) and as an object whose properties you can access like a regular class (object.property).
Here’s an example of parsing a request sent to an imaginary API that books tickets:
module Api
  class TicketBooker
    def self.book_ticket(request_body)
      parsed_request = Request.new(request_body)
      # ... more work
    end
  end
end
# Incoming JSON format
# {
# 'user_id': 13,
# 'booking': {
# 'show_id': 37,
# 'price': 75.99,
# 'seats': 2,
# 'preorder': true,
# }
# }
class Request < Hashie::Trash
  property :user_id, required: true
  property :booking,
           with: -> (hash) { Request::Booking.new(hash.symbolize_keys) }

  class Booking < Hashie::Trash
    property :show_id, required: true
    property :price_cents, from: :price, with: -> (float) { (float * 100).to_i }
    property :number_of_seats, from: :seats
    property :preorder
  end
end
This code accepts a JSON payload and parses the data into the format the rest of the system expects. You can nest as many of these as you want, since we use Hashie::Trash’s transformation block to pass the parsing down to a deeper level of the nesting.
It also handles bad input really well - if a required parameter is missing, an ArgumentError will be raised. Similarly, if the user passes any unexpected parameter, a NoMethodError will be raised.
You could potentially allow unknown parameters to be passed by including a module provided by Hashie:
class Request < Hashie::Trash
include Hashie::Extensions::IgnoreUndeclared
end
The resulting Hashie::Trash object is really flexible - you can use it as a clean object and access properties in a very nice way, or as a Hash and use it normally.
Potentially, you could transform the inbound request to fit Rails’ nested_attributes so that you could validate the data and pass it straight to a Rails model’s .create method - which is really cool. Just note that if you do that and use Rails 4+ or strong parameters, you’ll need to use Hashie::Rails, which works around an issue with :permitted? on Hashie objects.
# using a Hashie object
parsed_request = Request.new(request_body)
parsed_request.user_id               # 13
parsed_request[:user_id]             # 13
parsed_request.booking.price_cents   # 7599
parsed_request.booking[:price_cents] # 7599
parsed_request.booking.preorder?     # true
parsed_request.merge!({foo: :bar})   # {...}
You can also use Hashie to parse responses when you call an API.
It’s super useful especially when working with APIs that have a very standard way to respond.
I first saw this used in an old version of Octokit where they used Faraday to mashify the responses from the GitHub API.
Hashie::Mash is a looser version of Hashie::Trash that doesn’t support transformations or required properties, and is useful when you just want to generically wrap a Hash in a more robust interface. Hashie::Mash is also automatically deep, meaning that any sub-hashes are wrapped in a Mash as well.
When I started wrapping calls to other APIs in the server, I also used Hashie to parse the responses generated, although I used HTTParty and not Faraday to do the calls. (The code that wrapped the response was delegated to an HTTParty parser, but that’s a story for a different blog post.)
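As a rough sketch of the idea (the endpoint and fields here are made up, and in my real code the wrapping was delegated to an HTTParty parser), it looks something like this:
require 'httparty'
require 'hashie'

# hypothetical endpoint - any API that returns JSON will do
response = HTTParty.get('https://api.example.com/shows/37')

# wrap the parsed JSON so nested fields get clean method access
show = Hashie::Mash.new(response.parsed_response)

show.title        # method-style access
show.venue.city   # sub-hashes are Mashes too, so this just works
show[:title]      # indifferent hash access still available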
While working on the API, I was surprised that not many people have touched on the subject of parsing requests, and I hope this helps a few people down the path of building better, more stable APIs… until next time :)
You can always find me on twitter @yonbergman.
P.S. I’ll be speaking at this year’s Golden Gate Ruby Conference in San Francisco in September. It’d be cool to meet you so say hey if you see me walking around the conference hall.
Step 0 - Make sure you can run the psql and pg_upgrade commands by checking that the following line appears in your .profile, .bashrc, or .zshrc (whichever you use):

PATH="/Applications/Postgres.app/Contents/MacOS/bin:$PATH"

- Run which psql - it should point to /Applications/Postgres.app/Contents/MacOS/bin/psql. If it doesn’t, check Step 0 again.
- Run psql and connect to the DB.
- Run \l to get a list of databases. You should have 4 databases: postgres, 2 templates and one database for your username.
- Run \c postgres to move to the postgres database from your user’s database. This is necessary since you can’t drop the database you are currently in (see next step).
- Run DROP DATABASE username; (replace username with your username) to remove your user’s database.
- Run \q to close psql.
- Run pg_upgrade, pointing it at both the old and new data and bin directories:

pg_upgrade --old-datadir ~/Library/Application\ Support/Postgres/var \
  --new-datadir ~/Library/Containers/com.heroku.postgres/Data/Library/Application\ Support/Postgres/var \
  --old-bindir ~/Desktop/Postgres.app/Contents/MacOS/bin \
  --new-bindir /Applications/Postgres.app/Contents/MacOS/bin

Upgrade Complete

- Run ./analyze_new_cluster.sh to get your DB ready for business.
- (Optional Last Step) - Run ./delete_old_cluster.sh to remove the old 9.2 data and delete the old Postgres.app from your desktop.

This is what I did whenever I had issues during the upgrade process and needed a hard reset: delete the ~/Library/Containers/com.heroku.postgres/Data/Library/Application\ Support/Postgres/var directory and start over.

I’ve been looking a long time for a Postgres client for Mac that is even a fraction of what Sequel Pro is for MySQL.
Well, I just found out about a pretty new client that shows a lot of potential and is really good (at least for my uses) - PG Commander. You should definitely check it out.
Hope this helps a few people.
You can always find me on twitter @yonbergman.