@Wizbyt wrote:
Our Ember application has to work with quite large data sets, and we've found that loading 5,000+ objects into the store noticeably impacts performance.
To test this, we used a plain ajax GET to return 17,000+ objects from our backend and displayed the results in ember-models-table; this takes about 1 second to grab the data and fully render the display.
Next we tried using the RESTAdapter to grab 13,000+ objects, import them into the store, and fully render the display; this takes 18 seconds.
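For reference, the two test paths look roughly like this; the endpoint URL and model name are placeholders, not our real ones:

```js
import Route from '@ember/routing/route';
import $ from 'jquery';

// Path 1: plain ajax, bypassing the store. The raw JSON array is
// handed straight to ember-models-table with no per-record
// normalization or store insertion.
export default Route.extend({
  model() {
    return $.getJSON('/api/records');
  }
});
```

```js
import Route from '@ember/routing/route';

// Path 2: through ember-data. Every record is normalized by the
// serializer and pushed into the store before rendering.
export default Route.extend({
  model() {
    return this.store.findAll('record');
  }
});
```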
We've started paginating on the backend to work around this, but I can see us needing to return very large result sets at some point in the future, and pagination also means creating multiple endpoints that all do pretty much the same thing.
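Our pagination currently amounts to query params on a single endpoint; a minimal sketch, with param names assumed to match whatever the backend expects:

```js
import Route from '@ember/routing/route';

export default Route.extend({
  queryParams: {
    page: { refreshModel: true }
  },

  model(params) {
    // store.query serializes the hash into query params, e.g.
    // GET /records?page=2&per_page=500 (param names are backend-specific),
    // so one endpoint serves every page.
    return this.store.query('record', { page: params.page, per_page: 500 });
  }
});
```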
- How are other people working with large data sets?
- Is the JSONAPIAdapter any quicker, or does the bottleneck lie in ember-data importing records into the store?
- Would rewriting our adapter improve performance? If so, what should we be doing, and what would it look like? (See below for our very basic current adapter and serializer.)
I'd like to stick with Ember if possible, but if there are performance issues with the store and large data sets we might have to evaluate other frameworks (which I'd prefer not to do!).
Adapter
```js
import DS from 'ember-data';
import DataAdapterMixin from 'ember-simple-auth/mixins/data-adapter-mixin';
import { inject as service } from '@ember/service';
import config from '../config/environment';

export default DS.RESTAdapter.extend(DataAdapterMixin, {
  session: service(),

  // Attach the bearer token from ember-simple-auth to every request.
  authorize(xhr) {
    let { accessToken } = this.get('session.data.authenticated');
    xhr.setRequestHeader('Authorization', `Bearer ${accessToken}`);
  },

  host: config.hostname
});
```
Serializer
```js
import DS from 'ember-data';

export default DS.RESTSerializer.extend({
  // Our backend uses Mongo-style `_id` as the primary key.
  primaryKey: '_id',

  // Make sure ids are serialized as strings.
  serializeId(id) {
    return id.toString();
  }
});
```
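For context, with `primaryKey: '_id'` the RESTSerializer expects payloads shaped roughly like this (the `records` key, ids, and fields are placeholders for our real model), remapping `_id` to ember-data's `id` on every record during normalization:

```json
{
  "records": [
    { "_id": "5a1f2b3c", "name": "example one" },
    { "_id": "5a1f2b3d", "name": "example two" }
  ]
}
```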