Migration from AppFarm to Traditional Code – Data Strategy Discussion

Hey everyone,

We’re currently planning a move away from AppFarm to a more traditional code-based setup, and I wanted to share what we’re doing and get some input from the community.

The main goal is to take all our existing data from AppFarm and move it into a proper database (most likely SQL). This will give us more control, flexibility, and make things easier to scale going forward.

Right now, we’re thinking through things like:

  • How to best extract data from AppFarm (API vs export)

  • Mapping the existing structure into a clean database schema

  • Setting up a proper migration pipeline

  • Making sure we don’t lose data or break relationships

  • Deciding between a full migration vs incremental sync during the transition

A few concerns on our side:

  • AppFarm’s data structure isn’t always straightforward

  • Keeping data consistent during the move

  • Handling larger datasets without performance issues

If anyone here has done something similar or has experience moving away from AppFarm (or any low-code platform), we'd really appreciate your thoughts:

  • What worked well for you?

  • Anything you’d do differently?

  • Any tools or approaches you’d recommend?

Hi, thanks for the open and well-structured post.

For data extraction, the Data Extract API is generally the most structured option: object-class-level endpoints with cursor-based pagination, well suited for a migration pipeline. GraphQL is an alternative if you need more query flexibility. Both require a service account with appropriate permissions and an API key.
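To make the pipeline shape concrete, here's a minimal sketch of draining a cursor-paginated endpoint. The loop logic is the point; the endpoint path, auth header, and response field names mentioned in the comments are assumptions for illustration, not the real Data Extract API contract, so check the API docs for your solution.

```python
# Hedged sketch of a cursor-pagination loop for a migration pipeline.
def fetch_all(fetch_page):
    """Drain a paginated source.

    fetch_page(cursor) -> (items, next_cursor); next_cursor is None
    when the last page has been returned.
    """
    items, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        items.extend(page)
        if cursor is None:
            return items

# In a real pipeline, fetch_page would issue an HTTP GET along the lines of
#   GET /api/<object-class>?cursor=<cursor>   (illustrative path only)
# sending the service account's API key in a request header, and read the
# next cursor from the response body.
```

Keeping the HTTP call behind a `fetch_page` callable also makes the loop easy to test and to retry/rate-limit independently of the transport.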

On mapping to SQL, the main things to account for are multi-reference properties (cardinality many), which are stored as arrays and need junction tables, and function-type properties, which are calculated at runtime and not stored; those will need to be re-implemented in your target system. More on the data structure in the Object Classes documentation.
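For the junction-table part, a minimal sketch of what the mapping looks like in SQL (here via SQLite for illustration; the object-class names "project"/"member" are made up, and your schema and types will differ):

```python
import sqlite3

# An array-valued multi-reference property ("members" on project) becomes
# one junction row per referenced object instead of a stored array.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (id TEXT PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE member  (id TEXT PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE project_member (
    project_id TEXT NOT NULL REFERENCES project(id),
    member_id  TEXT NOT NULL REFERENCES member(id),
    PRIMARY KEY (project_id, member_id)
);
""")

def insert_project(conn, project, member_ids):
    """Insert one project row plus one junction row per referenced member."""
    conn.execute("INSERT INTO project VALUES (?, ?)",
                 (project["id"], project["name"]))
    conn.executemany("INSERT INTO project_member VALUES (?, ?)",
                     [(project["id"], m) for m in member_ids])
```

The composite primary key on the junction table also gives you a cheap consistency check during migration: duplicate references in the source data will fail loudly instead of silently doubling rows.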

Jan Einar - Appfarm Team

Hello, we’ve been doing this to replicate the data into another SQL database.
As far as I know, there are a couple of challenges with the GraphQL/Data Extract API:

  1. The full definition of enums is only available through the services.
  2. You cannot access the entire data model’s object-class properties (unique, type, required, …). You can use the GraphQL introspection query, but it won’t really help you with the details. For example, af_createdDate is reported as a string, not a datetime.
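To illustrate point 2: the standard GraphQL introspection query for one type's fields is roughly `{ __type(name: "SomeObjectClass") { fields { name type { name kind } } } }`. The helper below flattens such a response into a field → type-name map, which makes the limitation visible: only GraphQL scalar names come back, so a datetime property like af_createdDate shows up as plain "String", and constraints like required/unique aren't exposed at all. The sample response shape in the test follows the GraphQL introspection spec, not anything AppFarm-specific.

```python
# Flatten a __type introspection result into {field name: GraphQL type name}.
def field_types(introspection_result):
    fields = introspection_result["data"]["__type"]["fields"]
    return {f["name"]: f["type"]["name"] for f in fields}
```

So introspection is useful for enumerating fields, but you'll still need another source of truth for column types and constraints when generating your SQL schema.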