Ember Data is moving fast these days. A lot of refactoring is going on, and the documentation is not keeping up with the code.
To help me understand (and sometimes reverse engineer) Ember Data, I use dedicated unit tests written with QUnit. Each time I explore a new concept, I write a small QUnit test to play with it on a toy case. Once my understanding is solid, I apply it to real code.
This also helps me check that my understanding of Ember Data still holds when I upgrade the framework. If it doesn't, I can decide not to upgrade, or investigate what has changed since my last understanding.
This approach is inspired by the eXtreme Programming spike. You can find more historical background on the c2 wiki.
If you are interested in test classification, I would not call these tests "unit" tests, as they actually exercise the whole library end-to-end. They are more like acceptance tests.
In the first example, I explore how Ember Data saves models to the backend using a regular DS.RESTAdapter. My backend is a fully custom Erlang-based server (using cowboy), and I don't know much about RoR conventions. This test helps me understand how Ember sends data to the server and what it expects as a result. I can then implement the correct protocol on the server side.
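For reference, this is the shape of the exchange such a test reveals, assuming the conventional pre-1.0 RESTAdapter root-key format; the App.Person model, the /persons URL, and the field names here are made up for illustration:

```javascript
// Hypothetical example of the wire format a pre-1.0 DS.RESTAdapter expects
// (the model and field names are invented; the singular root key and the
// id echoed back by the server are the important parts).

// What the adapter sends to POST /persons when committing a new App.Person:
var request = { person: { firstName: "Tom", lastName: "Dale" } };

// What the server is expected to answer: the same record, plus its new id:
var response = { person: { id: 1, firstName: "Tom", lastName: "Dale" } };
```

With this in hand, the Erlang side only has to unwrap the root key, persist the record, and echo it back with an id.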
I use Mockjax to simulate the server, which lets me debug what is sent as JSON data and put some assertions on it.
The most difficult part of this kind of test is handling the asynchronous behavior of Ember's commit, which returns without waiting for the answer from the server. The data is populated some time later, when the network sends back the response. That's why you need the timer trick to make QUnit test it correctly.
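To make the pattern concrete, here is a sketch of such a test. It assumes a pre-1.0 store exposed as App.store, a made-up App.Person model and /persons URL; none of these names come from a real project. The setTimeout is the timer trick: it gives Mockjax time to answer before the assertions run.

```javascript
// Sketch only: assumes jQuery Mockjax and a pre-1.0 Ember Data store
// (revision 8 era, where store.commit() still exists).
$.mockjax({
  url: '/persons',
  type: 'POST',
  response: function(settings) {
    // settings.data holds whatever the adapter sent; this is the place
    // to inspect it and assert on the payload shape.
    this.responseText = { person: { id: 1, firstName: 'Tom' } };
  }
});

asyncTest('committing a new record hits the backend', function() {
  var person = App.store.createRecord(App.Person, { firstName: 'Tom' });
  App.store.commit(); // returns immediately, before the server answers

  // The timer trick: wait for the mocked response before asserting.
  setTimeout(function() {
    equal(person.get('id'), 1, 'id was assigned from the server response');
    start(); // tell QUnit the async part is over
  }, 100);
});
```

The 100 ms delay is arbitrary; it just needs to be longer than the Mockjax response time.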
Designing this kind of testing pattern is not easy when you are a beginner. You have to dig into the Ember Data unit tests for inspiration.
I am now trying to store a nested model in the same commit transaction. After some testing, it appears this is not currently possible, and you can see people waiting for the feature. Someone has already submitted a pull request to address the issue. The answer from Tom Dale leads me to think this is quite a complex problem. I have therefore decided to go the easy/safe way and store/load the model as one nested JSON hierarchy.
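To illustrate what I mean by one nested hierarchy (the person/address names are made up): instead of committing a parent record and a related child record, the child is simply embedded in the parent's JSON, and the whole thing travels as a single document.

```javascript
// Illustrative only: a hypothetical person with its address embedded,
// stored and loaded as one JSON document instead of two related records.
var person = {
  firstName: 'Tom',
  address: { street: '123 Main St', city: 'Chicago' }
};

// One string goes to the backend on save...
var stored = JSON.stringify(person);

// ...and one parse restores the whole hierarchy on load.
var loaded = JSON.parse(stored);
```

The trade-off is that the backend sees an opaque blob, but nothing depends on Ember Data's still-moving relationship support.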
Reading the current README documentation, I discover that we can add custom types with serialization and deserialization of JSON data using DS.attr.transforms, as they say. Unfortunately, I soon discover this no longer works with the library version I am using (revision 8). Even worse (or better?), this is a part of the library that is under fast-moving refactoring.
Here comes our spike/QUnit test to help!
After a lot of googling and trial and error, I finally understand that the correct way for this version of the library is:
DS.Transforms.reopen({
  'App.Person': {
    // deserialize: JSON string from the server -> Ember object
    fromData: function(serialized) {
      if (serialized == undefined) return undefined;
      var o = JSON.parse(serialized);
      return App.Person.create(o);
    },
    // serialize: Ember object -> JSON string for the server
    toData: function(deserialized) {
      return JSON.stringify(deserialized);
    }
  }
});
And I can write the corresponding QUnit test.
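The test simply checks that toData and fromData round-trip a value. To keep this sketch self-contained and runnable outside Ember, I stand in for App.Person.create with a plain object factory; in the real QUnit test it would be the App.Person class used in the transform above, with the assertions wrapped in a test() block.

```javascript
// Plain-JS stand-ins for the transform pair registered above, so the
// round-trip logic can be exercised without an Ember runtime.
var createPerson = function(attrs) { return attrs; }; // stands in for App.Person.create

var fromData = function(serialized) {
  if (serialized == undefined) return undefined;
  return createPerson(JSON.parse(serialized));
};
var toData = function(deserialized) {
  return JSON.stringify(deserialized);
};

// Round-trip: object -> wire string -> object.
var original = { name: 'Tom' };
var onTheWire = toData(original);      // '{"name":"Tom"}'
var roundTripped = fromData(onTheWire);
```

In QUnit terms, the assertions would be deepEqual(roundTripped, original) and equal(fromData(undefined), undefined), the latter covering the guard clause in the transform.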