In a recent project, we had a REST API with a GET /profile route that returned profile data. The profile data had to be assembled from the users table as well as several related tables: universities, identities, sectors, companies, question_answers, and can_help. Each of these tables is related to the user via a foreign key, and initially the API issued a separate network request for each table query, so the number of database round trips grew linearly with the number of tables: O(N). After optimisation this was reduced to a single round trip: O(1) in the number of tables. The optimisation was to send the queries in bulk (multiple SELECT statements in a single network request) and extract each table's rows from the DB response afterwards, like so:
const [ query_tables ] = await db.read.raw(`
SELECT
un.id AS id,
un.name AS name
FROM universities un
INNER JOIN user_universities us