Token limits set the boundaries for how much text you can analyze in a single prompt and how detailed the AI’s response can be. These limits depend on whether you are using Discourse Analyzer Simple or Advanced, and which subscription plan you have.
## Discourse Analyzer Simple: Input and Output Token Limits
Monthly Plans:
- White Dwarf Monthly: 1,000 input tokens and 1,000 output tokens per prompt
- Blue Giant Monthly: 2,000 input tokens and 2,000 output tokens per prompt
- Red Giant Monthly: 3,000 input tokens and 3,000 output tokens per prompt
Yearly Plans:
- White Dwarf Yearly: 4,000 input tokens and 2,000 output tokens per prompt
- Blue Giant Yearly: 6,000 input tokens and 3,000 output tokens per prompt
- Red Giant Yearly: 10,000 input tokens and 4,000 output tokens per prompt
These numbers cap how much text you can submit for analysis and how long the AI's answer can be in each interaction. For discourse analysts, this means you can plan how much material, such as interview transcripts, articles, or corpus extracts, to process at once and know how much detail to expect back from the AI.
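If you want a quick way to gauge whether a text will fit within your plan's input limit before submitting it, a rough word-count estimate is usually enough. The sketch below is a minimal illustration, not part of Discourse Analyzer itself: the plan table simply mirrors the limits listed above, the file name is hypothetical, and the ~0.75 words-per-token ratio is a rule of thumb, so the real count depends on the tokenizer.

```python
# Minimal sketch: estimate whether a text fits a plan's per-prompt input limit.
# The ~0.75 words-per-token ratio is a heuristic; actual counts vary by tokenizer.

INPUT_LIMITS = {
    "White Dwarf Monthly": 1_000,
    "Blue Giant Monthly": 2_000,
    "Red Giant Monthly": 3_000,
    "White Dwarf Yearly": 4_000,
    "Blue Giant Yearly": 6_000,
    "Red Giant Yearly": 10_000,
}

def estimate_tokens(text: str) -> int:
    """Approximate token count: ~0.75 words per token, so tokens ≈ words / 0.75."""
    return round(len(text.split()) / 0.75)

def fits_input_limit(text: str, plan: str) -> bool:
    """True if the estimated token count is within the plan's input limit."""
    return estimate_tokens(text) <= INPUT_LIMITS[plan]

# Hypothetical usage with a local transcript file:
with open("interview_01.txt", encoding="utf-8") as f:
    print(fits_input_limit(f.read(), "Blue Giant Monthly"))
```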
## Discourse Analyzer Advanced: Large-Scale Token Limit
Discourse Analyzer Advanced is built for deep, multi-source, and demanding analyses. Here, the token limit is set to 1,000,000 tokens per prompt.
### What does 1 million tokens mean?
To put this into perspective, 1,000,000 tokens is about 750,000 words (a common rule of thumb is that one token corresponds to roughly 0.75 English words). That is roughly equivalent to:
- Around 8 to 10 standard academic books
- Roughly 100 scholarly articles of average length
- Hundreds of hours of interview transcripts
- Or an entire large research corpus in one go
This huge capacity allows you to analyze full datasets, large corpora, or multi-document projects without splitting your input into smaller parts.
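To check whether a whole corpus would fit in a single Advanced prompt, you can extend the same word-count heuristic across files. This is a rough sketch under the same assumptions as above (a hypothetical corpus folder and the ~0.75 words-per-token ratio); leave a safety margin, since real token counts depend on the tokenizer.

```python
from pathlib import Path

ADVANCED_LIMIT = 1_000_000  # Advanced per-prompt token limit

def estimate_tokens(text: str) -> int:
    return round(len(text.split()) / 0.75)  # same rule of thumb as above

# Sum estimated tokens across every .txt file in a (hypothetical) corpus folder.
total = sum(
    estimate_tokens(path.read_text(encoding="utf-8"))
    for path in Path("corpus").glob("*.txt")
)
print(f"~{total:,} estimated tokens; fits one Advanced prompt: {total <= ADVANCED_LIMIT}")
```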
Output length in Advanced is also flexible. The AI aims to generate a detailed, coherent response, but the actual length depends on your query, the amount of input, and the complexity of your request.
## Why Token Limits Matter
For discourse analysts, knowing your token limits is crucial. It helps you:
- Decide how much data to submit for analysis in one go
- Anticipate the level of detail in AI-generated interpretations
- Avoid interruptions or incomplete analyses when working with large datasets or long texts
- Manage your credits more effectively, since larger texts and responses use more resources
With a clear understanding of your token limits, you can organize your analytical workflow, segment your texts, and ensure smooth, uninterrupted use of Discourse Analyzer for both small and large projects.
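As a starting point for segmenting, the sketch below splits a long text into chunks that stay under a chosen token budget. The default budget, the file name, and the words-per-token ratio are all assumptions; in practice you would also want to split on natural boundaries such as speaker turns or paragraphs, and leave room in the budget for your instructions.

```python
def split_into_chunks(text: str, max_tokens: int = 3_000) -> list[str]:
    """Split text into word-based chunks of at most ~max_tokens tokens each."""
    max_words = int(max_tokens * 0.75)  # convert the token budget to words
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]

# Hypothetical usage: segment a long transcript for a 3,000-token input limit
# (the Red Giant Monthly plan).
with open("focus_group.txt", encoding="utf-8") as f:
    chunks = split_into_chunks(f.read())
print(f"{len(chunks)} chunks of roughly 3,000 tokens or fewer")
```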