Thought I’d let you all know where I am. The good news is that the internal API is basically complete (for publications), working, and fairly thoroughly tested. The bad news is that I’m having trouble interfacing it with Arc: some things are unusably slow, and sometimes it doesn’t work at all. Part of the problem is that the main Arc app assumes publication access is cheap, and requests far more data than it needs. That needs to change, but it’s a big project I don’t want to get into yet.
So my plan has changed: I’ll make the Arc app call SQL directly.
Does this mean my work was wasted? No. The code to convert files to SQL is still needed, and the internal API work will directly translate to the external API. The SQL → JSON → s‑expression functions I wrote for the API are also useful for reading directly from SQL.
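To make the SQL → JSON → s‑expression path concrete, here is a minimal sketch in Python with SQLite. The `publications` table, its columns, and the flat association-list output format are all my assumptions for illustration, not the actual schema or Arc/Racket code:

```python
# Sketch of SQL -> JSON-style dict -> s-expression, with a hypothetical schema.
import json
import sqlite3

def row_to_dict(cursor, row):
    """Convert a raw SQL row into a dict keyed by column name."""
    return {col[0]: val for col, val in zip(cursor.description, row)}

def dict_to_sexp(d):
    """Render a flat dict as an association-list style s-expression."""
    pairs = " ".join(f"({k} {json.dumps(v)})" for k, v in d.items())
    return f"({pairs})"

def publication_sexp(conn, pub_id):
    """Look up one publication and render it as an s-expression string."""
    cur = conn.execute(
        "SELECT id, title, author FROM publications WHERE id = ?", (pub_id,))
    return dict_to_sexp(row_to_dict(cur, cur.fetchone()))

# In-memory demo database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE publications (id INTEGER, title TEXT, author TEXT)")
conn.execute("INSERT INTO publications VALUES (1, 'On Arc', 'pg')")
print(publication_sexp(conn, 1))  # → ((id 1) (title "On Arc") (author "pg"))
```

The useful property is that the middle (dict/JSON) stage is shared: the same conversion serves both the external JSON API and the s-expression output that Arc/Racket code consumes.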
Why use an internal API at all? Service-oriented architecture has myriad benefits, chiefly scalability, robustness, and portability. I’d be happy to discuss it in more detail, or you can ask Google. So the current plan is to read and write SQL directly, because Arc. Later, when the entire app is converted to Racket, we can re-apply the internal API and work on making everything more service-oriented.
More good news: this probably means the external API will be done faster.
My current plan:
⑴ Implement reading directly from SQL, into publications stored in memory.
• we want all data requests to come straight from the source (SQL/API), and then later add a caching layer like memcached if necessary. But for now, because the app requests more data than it needs, continuing to store publications in memory avoids performance problems.
• after the livestream, I now have the main app loading from SQL, but it’s slow, and I don’t know why. Current guesses: my slow network; poor database indices; simply too much data, needing a single bulk query; or slow Arc–Racket communication. I’m hoping it’s the first. The last is the worst-case scenario: Arc itself has no database functions, so fixing it would be hard. A bulk query is ugly, but doable. We’ll see.
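For reference, the per-row vs. bulk-query difference looks like this in Python with SQLite. The schema is made up, and the real app is Arc/Racket, so this only illustrates the shape of the two patterns: N round trips versus one:

```python
# Hypothetical schema; illustrates per-row queries vs. one bulk query.
import sqlite3

def fetch_one_by_one(conn, ids):
    """N round trips: one query per publication (the slow pattern)."""
    return [conn.execute(
        "SELECT id, title FROM publications WHERE id = ?", (i,)).fetchone()
        for i in ids]

def fetch_bulk(conn, ids):
    """One round trip: a single bulk query using an IN clause."""
    placeholders = ",".join("?" * len(ids))
    return conn.execute(
        f"SELECT id, title FROM publications WHERE id IN ({placeholders}) "
        "ORDER BY id", ids).fetchall()

conn = sqlite3.connect(":memory:")
# An index on the lookup key (here, the PRIMARY KEY) matters for both patterns.
conn.execute("CREATE TABLE publications (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO publications VALUES (?, ?)",
                 [(i, f"pub-{i}") for i in range(100)])
ids = list(range(10))
assert fetch_one_by_one(conn, ids) == fetch_bulk(conn, ids)
```

With a local SQLite file the per-row version is tolerable; over a network connection to a database server, the N round trips are usually what kills you, which is why the bulk query is worth the ugliness.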
⑵ Perform the conversion to SQL
• the code for this is done; it just needs to be run on the live data, once the app is reading from SQL.
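I don’t know the actual on-disk file format, but the conversion step has roughly this shape. Everything here is an assumption for illustration: a simple `key: value` flat-file format and a hypothetical `publications` table:

```python
# Sketch of converting flat files to SQL rows; format and schema are assumed.
import sqlite3

def parse_publication(text):
    """Parse a hypothetical 'key: value' flat-file format into a dict."""
    fields = {}
    for line in text.splitlines():
        key, _, value = line.partition(": ")
        if key:
            fields[key] = value
    return fields

def load_into_sql(conn, texts):
    """Parse each publication file's text and insert it as a row."""
    conn.execute("CREATE TABLE IF NOT EXISTS publications "
                 "(id INTEGER PRIMARY KEY, title TEXT, author TEXT)")
    for text in texts:
        p = parse_publication(text)
        conn.execute("INSERT INTO publications VALUES (?, ?, ?)",
                     (int(p["id"]), p["title"], p["author"]))
    conn.commit()

conn = sqlite3.connect(":memory:")
load_into_sql(conn, ["id: 1\ntitle: On Lisp\nauthor: pg"])
```

The main operational point is that this is a one-shot batch job over the live data, so it should run inside a transaction (the single `commit()` above) and only after the SQL-reading code path is in place.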
⑶ Add functions to the internal API, for external requests
⑷ Various other things necessary for a complete external API, like login for private mail.
⑸ Fix Arc app to only request the data it needs from SQL, when it needs it.
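Step ⑸ amounts to replacing “load everything up front” with on-demand accessors like the following sketch (hypothetical schema again), which fetch one row at request time, with only the columns the caller needs:

```python
# Sketch of an on-demand accessor; schema and column names are assumptions.
import sqlite3

def get_publication(conn, pub_id, columns=("id", "title")):
    """Fetch a single publication on demand, requesting only the columns
    the caller actually needs. Nothing is cached in memory."""
    cols = ", ".join(columns)  # column names come from code, never from users
    row = conn.execute(
        f"SELECT {cols} FROM publications WHERE id = ?", (pub_id,)).fetchone()
    return dict(zip(columns, row)) if row else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE publications (id INTEGER PRIMARY KEY, "
             "title TEXT, author TEXT, body TEXT)")
conn.execute("INSERT INTO publications VALUES (1, 'On Arc', 'pg', '...')")
print(get_publication(conn, 1))  # → {'id': 1, 'title': 'On Arc'}
```

Note that the large `body` column is never read unless a caller asks for it, which is exactly the behavior the current app lacks.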
⑹ Change Arc app to not cache anything in memory.
⑺ Convert Arc app to Racket, one function at a time.
⑻ Change converted main Racket app to use the internal API.
Right now, converting the main app to SQL is my #1 priority. That opens countless doors, not least of which is the external API. Steps 3–7 may occur in any order, but I’m likely to prioritise the external API. Everyone wants it, including me.