* Rewrite prompts to allow more diversity from OpenAI models.
* Use a two-phase mechanism for contextual prompt generation, i.e. create a writing prompt, then call OpenAI with that prompt to generate text and an instruction related to the text.
* Get the responses separately; otherwise the output is far too concise and pointed to be useful.
* Support for custom topics via an input file.
* Support for a custom topic generation prompt, enabling category-specific model fine-tuning.
* Configurable batch size.
* Better concurrency/queue management.
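
The two-phase flow and batching described above could be sketched roughly as follows. This is a minimal illustration, not the tool's actual implementation: `call_openai` is a hypothetical placeholder for the real chat-completion request, and the function names and prompts are assumptions.

```python
import asyncio

# Hypothetical stand-in for the real OpenAI API call; stubbed so the
# sketch is self-contained and runnable.
async def call_openai(prompt: str) -> str:
    return f"<response to: {prompt[:40]}>"

async def generate_pair(topic: str) -> dict:
    # Phase 1: create a writing prompt about the topic.
    writing_prompt = await call_openai(
        f"Create a detailed writing prompt about: {topic}"
    )
    # Phase 2: generate the contextual text, then the related instruction,
    # as two separate requests -- a single combined request tends to
    # produce output that is too concise and pointed to be useful.
    text = await call_openai(writing_prompt)
    instruction = await call_openai(
        f"Write an instruction related to the following text:\n{text}"
    )
    return {"text": text, "instruction": instruction}

async def generate_batch(topics: list[str], batch_size: int = 4) -> list[dict]:
    # Configurable batch size via a semaphore: at most `batch_size`
    # topics are processed concurrently at any moment.
    semaphore = asyncio.Semaphore(batch_size)

    async def bounded(topic: str) -> dict:
        async with semaphore:
            return await generate_pair(topic)

    return await asyncio.gather(*(bounded(t) for t in topics))

if __name__ == "__main__":
    results = asyncio.run(generate_batch(["astronomy", "cooking"], batch_size=2))
    for pair in results:
        print(pair["instruction"])
```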