Our geek experiments and experiences
Installing Ruby on Rails on a Mac can be a daunting task. From 10.6 to 10.9, I’ve been through all the issues, starting with MacPorts and all kinds of Ruby installers. I’m going to explain how to set up a Rails environment as easily as possible. We’re going to start with a clean Mavericks install, so there are no conflicts with already installed software.
Step 1. Install Homebrew
Homebrew is a package manager for the Mac, just like apt or yum on Linux. It can install all sorts of applications and libraries. Start off by installing Homebrew with the following command:
ruby -e "$(curl -fsSL https://raw.github.com/Homebrew/homebrew/go/install)"
To make sure Homebrew is installed correctly, run:
brew doctor
Step 2. Install RVM
RVM stands for Ruby Version Manager. It can manage multiple Ruby versions, which is especially handy when you have more than one application to maintain. It also isolates Ruby and its gems, so you can start a new app with the latest version of Ruby while still running your older apps simultaneously.
\curl -sSL https://get.rvm.io | bash -s stable
Reload your bash_profile, so it can initialise RVM:
source ~/.bash_profile
Run rvm requirements to install the basic required packages with Homebrew. Accept the license agreement and install the Xcode command line tools. After installing the tools, continue the rvm requirements installation.
Step 3. Install latest Ruby
Install Ruby. ruby-head is the latest development version; you can also use 1.9.3, for example, or even a Ruby version with a different patch level. For ease of use we use ruby-head. You can run rvm list known to see all available Ruby versions.
rvm install ruby-head
Optional. Install Git
Git is a version control system, which you can use to manage your code repositories.
brew install git
Optional. Install MySQL
Install the MySQL database server and the client to connect to it. This used to be a tricky ordeal because the mysql2 gem needed specific file locations. Right now it should work out of the box.
brew install mysql
Optional. Install MongoDB
MongoDB is a very well-known document database. Only install it when you know you need it. There are many alternatives, such as Redis, a key-value store.
brew install mongo
sudo mkdir -p /data/db/
mongod # to start
Optional. Install PostgreSQL
Postgres is a more advanced SQL database. I think it’s becoming the default choice now that MySQL has been taken over by Oracle. Decide which one you want to use; beginners are better off starting with MySQL.
brew install postgresql
That’s it. After installing all these packages you can install Rails and create a new app by running:
gem install rails
rails new your_application_name
Rails shouldn’t be that hard to install anymore. Best of luck!
My working environment
Lastly, here is my Mac working environment. I use all of the applications on a daily basis:
In a previous post I mentioned that we bought a Yeti microphone to create tutorials with sound. However, for this first new tutorial movie we didn’t use the Yeti, because we didn’t have it yet. For the next ones we will.
Hopefully you’ll be amazed by the sound quality then!
We got a new gadget at the Floorplanner office! It’s a USB microphone called the Yeti.
The Yeti is one of the most advanced and versatile multi-pattern USB microphones available anywhere. Combining three capsules and four different pattern settings, the Yeti is an ultimate tool for creating amazing recordings, directly to your computer. With exceptional sound and performance, the Yeti can capture anything with a clarity & ease unheard of in a USB microphone.
Why do those guys need a microphone, you might ask? Well, that’s because we are creating a new set of tutorial movies, and this time they will have sound! Yes, they will have sound, unlike our silent hit movie with over 2.5M YouTube views.
Keep an eye on the Floorplanner help section for the new tutorial movies.
We have been working for some time now on using WebGL to render 3D floorplans on the client, and I think the time has come to share the first beta version with you.
Our new WebGL version has to improve on two things compared to our current Flash version: the floorplan has to look better, and it has to load faster, a lot faster. We are not there yet, but I think we are making great progress.
For our new WebGL engine we use a technique called deferred rendering. If you are interested in learning more about it, check out Florian Boesch’s post WebGL Deferred Irradiance Volumes and our post Rendering transparency in a deferred pipeline.
A very nice feature of the new viewer is that it’s completely backwards compatible. So when we launch the new WebGL viewer, all the floorplans in our database - nearly 12 million and counting - will instantly look more beautiful and load a hell of a lot faster!
At Floorplanner we like Bitcoin and that’s why you will soon be able to purchase Roomstyler Credits by paying with Bitcoin.
We are now in the final stages of testing the new system, so buying and spending Roomstyler Credits is - at this moment - limited to a small group of users. If we don’t encounter big roadblocks, we should be able to release it within a couple of weeks.
Why Roomstyler Credits? Well, that’s because we are going to offer premium features, and those premium features won’t be free. Think of rendering photorealistic interior images at a higher resolution, or requesting custom 3D furniture models or materials. And this is just the start; many more premium features will follow (for example virtual panoramas).
Most of the premium features will cost a small amount of money - only a couple of euros - and most people don’t like going through a payment system for such a small amount. That’s why we are introducing credits: buy a couple of credits once and spend them - without any friction - on all kinds of different things.
So if you are looking for a new and creative way to spend your Bitcoins, you might consider becoming an interior designer at Roomstyler.
We did an experiment - see the live demo here - to see if we could create a virtual panorama from an existing Roomstyler room. The room we used is called Modern Vintage Nursery and was made by ladyfakessi (btw, she won the Nursery Room contest with it).
For the panorama we created a standard cubestrip image by rendering several images and stitching them together. The viewer we used for this experiment is the krpano Panoramic Viewer.
The experiment showed us that it is possible to create panoramas from Roomstyler rooms, so potentially we can turn any room into an interactive panorama! But… it took a very long time to render the needed images at this resolution: over 3 hours. Hopefully we’ll find a way to substantially decrease this time by improving our render speed.
See live demo here.
Today, new rendering hardware was delivered to our office. Yay! We are going to use it to test a couple of different render engines. We want to improve the quality of the images, but we also want to create them faster, in real time if possible. Quite an interesting challenge!
First things first, let’s get this puppy up and running.
You would expect some high-tech post about 21st-century tech, right? I too had originally planned to write about the largest and most bleeding-edge multitouch tables known to man. But then something happened, and the meaning of tech in its essence suddenly became much more relevant to me.
from Greek τέχνη, techne, “art, skill, cunning of hand”, and -λογία, -logia: the making, modification, usage, and knowledge of tools, machines, techniques, crafts, systems, and methods of organization, in order to solve a problem, improve a pre-existing solution to a problem, achieve a goal, handle an applied input/output relation or perform a specific function.
Actually, the very first tech ever, controlled fire, is the subject of this post. I’ll tell you why:
How will a programmer survive the frozen wilderness (in a comfortable way)?
In a few days I’ll go on a long Christmas holiday with a group of friends. We’re going to the north of Sweden to enjoy some ice fishing, northern lights, sauna sessions and feasting. Since most of my friends are artists, we usually watch a lot of obscure movies, do a lot of drunken disco dancing and try to impress each other with 12-course organic meals. I tend to bring my laptop to do some hobby projects or some impulsive collaborations with my friends. I was also very much looking forward to studying astronomical systems with Stellarium.
Anyway, nature has gotten in the way of those plans.
A few days ago the biggest storm of the century (named Ivar) hit Sweden, rendering the roads useless and knocking out electricity in the region we’re going to. The road problem is not that hard: apparently we can park a mile away and use sleighs or snow scooters to bring our stuff over the frozen lake. The electricity problem is more serious. Of course we will bring a diesel generator, but cooking for 14 people on an electric stove connected to a generator sounds very crappy, especially considering the meals are supposed to be grand and impressive.
We really need some serious ovens for cooking. I haven’t got any real winter survival skills; I only ever helped build a pizza oven with clay and straw once, in the middle of summer. Clay and straw aren’t available at this time of year though, so I don’t think that knowledge is very helpful either.
The tools and materials I will have at my disposal:
- stones in all shapes and sizes
- frozen soil
- gasoline, oil, lighter
Many DIY wood ovens I found involve brick and mortar. We don’t have those, and even if we did, I believe the freezing cold would prevent the mortar from setting. So that isn’t an option. Of course our forefathers managed some hardcore ice ages just fine without bricks, so there are alternatives:
- Mesolithic fire pit / pit oven, from the Stone Age era
- Bloomery, from the early Iron Age era, built for smelting iron, so it’ll surely be hot; I don’t know if you can get your food in and out though
- Stone oven, from the Stone Age era
- primitive mud oven
- survivalist fire pit, like the ancient ones, but some other plastic-bag tricks are described here too
For now I cannot do more than print out and laminate those pages. I’ll test the theory in practice in a few days and report back then (if I don’t freeze to death). So as a goodbye I’ll leave this quote here:
“Beware of bugs in the above code; I have only proved it correct, not tried it.” - Donald Knuth
So you want to render transparent objects in your deferred renderer? Good luck. To quickly reiterate why transparency is difficult in deferred rendering… wait, no, go and have a look at John Chapman’s article about it. It’s a good read.
It really depends on what kind of transparent objects you have, but if you happen to have a simple scene, where you can afford to manually Z-sort your transparent objects using their center points, then the technique outlined below might suffice. It’s really simple and performs well. I invented it for WebGL, but naturally, use it wherever you want; it’s pretty generic.
It really depends on how your deferred pipeline works and how you apply shading, lighting, and direct and indirect illumination, but let’s make some assumptions:
- You can divide your objects into two sets: opaque and transparent
- You can run your shaders on different sets of objects and output to different textures
- You have a normal/depth render pass that outputs the world-space normal of a fragment to the RGB channels, and the fragment’s depth to the alpha channel, of the FBO’s attached float color texture
- You have an albedo pass that outputs the fragment’s color and transparency
First, this is the image that we’re aiming for:
Transparency in the back, in between, and in front, with an opaque object in front of a transparent one: a full-featured example, even with two transparent objects overlapping. To render this image, we first need to render normal/depth maps for the opaque and transparent objects separately. That looks like this (note that normals and depth are in a single texture):
Opaque objects (normals)
Opaque objects (depth)
Transparent objects (depth)
Next we need albedo maps. These maps tell us the actual surface color/texture and, in the case of transparent objects, the alpha value. For the spheres in our example, the albedo is simple: we render it with depth testing and backface culling enabled, and voilà.
Opaque objects (albedo)
For the transparent objects though, it’s a bit more complicated. As mentioned above, you can’t use depth testing; instead, you need to manually depth-sort those objects. That’s because we’re going to use alpha blending, and simply using the depth buffer won’t do. If you can precalculate the center points of those transparent objects, it’s easy: you just transform those points using the view matrix to get them into camera space, and sort them by their Z coordinate. You then use this sorted set to draw the objects back to front. Now the trick is to render with depth testing disabled and with the blending function set to GL_ONE, GL_ONE_MINUS_SRC_ALPHA. This means that the objects in front will add up to the color of the objects already drawn behind them. The result is this:
As it turns out, you can’t really use this image as-is, because it contains pixels that are not visible in the final image: the fragments that are occluded by (behind) the opaque objects. To get rid of them, we use the depth map of the opaque objects in this albedo shader and compare the depth of the current (semi-transparent) fragment with the depth of the same fragment in the opaque objects’ depth map. If the opaque object’s depth is lower than the depth of this semi-transparent one, the opaque object is occluding this fragment, and we discard it, or set it to zeros.
Transparent objects (albedo & alpha)
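The manual sorting and blend-state setup described above can be sketched in plain WebGL-style JavaScript. This is a minimal illustration, not our engine code: the drawObject callback and the objects’ precalculated center points are assumed, and the view matrix is taken to be a column-major array of 16 numbers.

```javascript
// Z component of (column-major 4x4 matrix) * (p.x, p.y, p.z, 1).
function transformZ(m, p) {
  return m[2] * p.x + m[6] * p.y + m[10] * p.z + m[14];
}

// Draw transparent objects back to front with premultiplied-alpha blending.
function drawTransparent(gl, viewMatrix, objects, drawObject) {
  // In camera space the camera looks down -Z, so a smaller (more negative)
  // Z means farther away; sorting ascending gives back-to-front order.
  const sorted = objects.slice().sort(
    (a, b) => transformZ(viewMatrix, a.center) - transformZ(viewMatrix, b.center)
  );
  gl.disable(gl.DEPTH_TEST);                    // no depth testing, as described
  gl.enable(gl.BLEND);
  gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA); // objects in front add up over those behind
  for (const obj of sorted) drawObject(obj);
}
```
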
To composite the final albedo map, we simply use the alpha value from the transparent objects’ albedo map to blend between transparent and opaque albedo maps.
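Because the transparent albedo was rendered with GL_ONE, GL_ONE_MINUS_SRC_ALPHA (premultiplied alpha), this blend amounts to the standard “over” operator. Here is a tiny CPU-side illustration of that composite, with an assumed data layout rather than real shader code:

```javascript
// Composite one pixel: the transparent map is premultiplied, so
// final = transparent.rgb + (1 - transparent.a) * opaque.rgb
function compositeAlbedo(opaqueRGB, transparent) {
  return opaqueRGB.map((ch, i) => transparent.rgb[i] + (1 - transparent.a) * ch);
}

// A half-transparent green fragment over a red opaque one:
console.log(compositeAlbedo([1, 0, 0], { rgb: [0, 0.5, 0], a: 0.5 })); // [0.5, 0.5, 0]
```
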
If you want to apply some shading and lighting (Lambert, Phong, you name it), you will have to do something very similar to the above: render it separately for the transparent and opaque objects and blend between those, using the values from the depth maps and the alpha value from the albedo maps. Have fun blending.
In the process of developing our floorplan engine, I ran into this subject, and I’d like to collect the mathematics as well as the code in one place for the folks who are interested.
Let’s start with the Bézier curve. As we know, a Bézier curve is defined by a set of control points P0 through Pn, where n is called its order: n = 1 for linear, 2 for quadratic, 3 for cubic. It can be expressed by the function B(t) shown in the picture below; for the quadratic case, B(t) = (1 - t)²P0 + 2(1 - t)tP1 + t²P2. The first and last control points are always the endpoints of the curve. A linear Bézier curve is simply a straight line between those two points.
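To make the definition concrete, here is a small JavaScript sketch of B(t) for the linear and quadratic cases, applied to one coordinate at a time (the function names are mine, not from our engine):

```javascript
// Linear Bézier: B(t) = (1 - t) P0 + t P1
function linearBezier(p0, p1, t) {
  return (1 - t) * p0 + t * p1;
}

// Quadratic Bézier: B(t) = (1 - t)^2 P0 + 2 (1 - t) t P1 + t^2 P2
function quadraticBezier(p0, p1, p2, t) {
  return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t * t * p2;
}

// The curve starts at P0, ends at P2, and is pulled toward the control point P1:
console.log(quadraticBezier(0, 1, 0, 0.5)); // 0.5 - halfway toward the control value
```
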
Next, a common problem is to find the common point(s) of a line segment and a quadratic curve. So the exact problem is: given PA and PB defining a linear Bézier curve (line segment) L, and points PC, PD, PE defining a quadratic Bézier curve C, find the intersection(s), where t represents the position on the curve. This is shown in the picture below.
First we look at the regular cases, where the line segment is not axis-aligned (what the code below calls orthogonal). We can express this problem in the following way.
Write the line as y = kx + m, with k = (PA.y - PB.y) / (PA.x - PB.x) and m = PA.y - k·PA.x. Substituting the curve’s coordinates into this line equation resolves into a quadratic equation in t:

at² + bt + c = 0, where

a = k(PC.x - 2PD.x + PE.x) - (PC.y - 2PD.y + PE.y)
b = -2(k(PC.x - PD.x) - (PC.y - PD.y))
c = k(PC.x - PA.x) - (PC.y - PA.y)

So the solution(s) for t are

t = (-b ± √(b² - 4ac)) / (2a)
As you can see, the intersection has three possible scenarios: if ∆ > 0, line L crosses curve C twice, giving 2 intersections; if ∆ = 0, line L crosses curve C only once; if ∆ < 0, there is no intersection. In other words, solving the quadratic equation can return nothing.
Second, we handle the special cases: axis-aligned lines. For a vertical line x = A, substituting into the x coordinate of B(t) gives a quadratic in t with

a = PC.x - 2PD.x + PE.x, b = -2(PC.x - PD.x), c = PC.x - A

For a horizontal line y = B, the same holds with the y coordinates:

a = PC.y - 2PD.y + PE.y, b = -2(PC.y - PD.y), c = PC.y - B
The last step is to use the returned t values to get the corresponding points on curve C using B(t).
There is also a flag, limited, to toggle whether we only return intersections that fall on the curve itself (0 ≤ t ≤ 1) or also on its extension. As in the case shown in the picture below, the function can return null if limited is true, even when ∆ > 0. If you want, you can also double-check whether the point lies on the line segment itself or on its extension.
Here is the code, CoffeeScript-style.
intersectBezier2WithLine = (C, L, limited = true) ->
  if orth = L.isOrthogonal() # assumed to return 'vertical', 'horizontal' or false
    if orth is 'vertical' # line x = L.A.x
      a = C.A.x - 2 * C.cp.x + C.B.x
      b = -2 * (C.A.x - C.cp.x)
      c = C.A.x - L.A.x
    else # horizontal line y = L.A.y
      a = C.A.y - 2 * C.cp.y + C.B.y
      b = -2 * (C.A.y - C.cp.y)
      c = C.A.y - L.A.y
  else
    k = (L.A.y - L.B.y) / (L.A.x - L.B.x) # slope of the line
    a = k * (C.A.x - 2 * C.cp.x + C.B.x) - (C.A.y - 2 * C.cp.y + C.B.y)
    b = -2 * (k * (C.A.x - C.cp.x) - (C.A.y - C.cp.y))
    c = k * (C.A.x - L.A.x) - (C.A.y - L.A.y)
  if solutions = (if a is 0 then [-c / b] else solveQuadraticEquation a, b, c)
    getPoint C, t for t in solutions when not limited or 0 <= t <= 1
lerp = (degrees..., t) ->
  lerp2 = (a, b, t) -> a + t * (b - a)
  while degrees.length isnt 1
    degrees = for i in [0..(degrees.length - 2)]
      lerp2 degrees[i], degrees[i + 1], t
  degrees[0]
solveQuadraticEquation = (a, b, c) ->
  return if a == 0
  if (disc = Math.pow(b, 2) - 4 * a * c) >= 0
    [(-b - Math.sqrt(disc)) / (2 * a), (-b + Math.sqrt(disc)) / (2 * a)]
getPoint = (C, t) ->
  [lerp(C.A.x, C.cp.x, C.B.x, t), lerp(C.A.y, C.cp.y, C.B.y, t)]
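To sanity-check the formulas, here is a self-contained JavaScript port of the horizontal-line special case (the names mirror the CoffeeScript above, but the geometry is an arbitrary test case of mine): intersect the quadratic curve with A=(0,0), cp=(1,2), B=(2,0) with the line y = 1.

```javascript
function solveQuadraticEquation(a, b, c) {
  if (a === 0) return undefined;
  const disc = b * b - 4 * a * c;
  if (disc < 0) return undefined; // ∆ < 0: no real solutions, no intersection
  const sq = Math.sqrt(disc);
  return [(-b - sq) / (2 * a), (-b + sq) / (2 * a)];
}

function lerp2(a, b, t) { return a + t * (b - a); }

// Point on the quadratic curve via repeated linear interpolation (de Casteljau).
function getPoint(C, t) {
  return [
    lerp2(lerp2(C.A.x, C.cp.x, t), lerp2(C.cp.x, C.B.x, t), t),
    lerp2(lerp2(C.A.y, C.cp.y, t), lerp2(C.cp.y, C.B.y, t), t),
  ];
}

// Horizontal line y = lineY: solve
// A.y + 2t(cp.y - A.y) + t^2 (A.y - 2 cp.y + B.y) = lineY for t.
function intersectWithHorizontal(C, lineY, limited = true) {
  const a = C.A.y - 2 * C.cp.y + C.B.y;
  const b = -2 * (C.A.y - C.cp.y);
  const c = C.A.y - lineY;
  const ts = a === 0 ? [-c / b] : solveQuadraticEquation(a, b, c);
  if (!ts) return [];
  return ts.filter(t => !limited || (0 <= t && t <= 1)).map(t => getPoint(C, t));
}

const C = { A: { x: 0, y: 0 }, cp: { x: 1, y: 2 }, B: { x: 2, y: 0 } };
console.log(intersectWithHorizontal(C, 1)); // tangent at the apex: [[1, 1], [1, 1]]
```

The line y = 1 touches the curve’s apex, so ∆ = 0 and both roots coincide at t = 0.5, giving the point (1, 1) twice.
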