The number of tokens produced by the tokenizer depends on the language and the specific model used. However, as a general guideline, you can use the following word-to-token ratios: English: 1 word ≈ 1.3 tokens; Spanish: 1 word ≈ 2 tokens.
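As a quick illustration of these ratios, here is a minimal Python sketch. The function name and ratio table are just for this example; the real count always comes from the model's tokenizer, so treat this as a rough estimate only.

```python
# Rough token estimate from a word count, using the heuristic ratios above.
# These are approximations; the actual count depends on the tokenizer.

WORDS_TO_TOKENS = {
    "english": 1.3,  # 1 word ≈ 1.3 tokens
    "spanish": 2.0,  # 1 word ≈ 2 tokens
}

def estimate_tokens(text: str, language: str = "english") -> int:
    """Estimate the token count of `text` from its word count."""
    words = len(text.split())
    return round(words * WORDS_TO_TOKENS[language])

print(estimate_tokens("The quick brown fox jumps over the lazy dog"))  # ~12 tokens
```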
You receive about $5 each month to spend on tokens. How many tokens that buys depends on the LLM you decide to use; with GPT-3.5 Turbo it worked out to roughly 3.3 million tokens.
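For a back-of-the-envelope check of that 3.3 million figure, here is a small sketch. It assumes an input price of roughly $0.0015 per 1K tokens for GPT-3.5 Turbo; actual prices change over time and differ between input and output tokens, so the result is only indicative.

```python
# Estimate how many tokens a monthly budget buys at an assumed per-1K-token price.
budget_usd = 5.00
price_per_1k_tokens = 0.0015  # assumed GPT-3.5 Turbo input price in USD

tokens = budget_usd / price_per_1k_tokens * 1000
print(f"{tokens:,.0f} tokens")  # ≈ 3,333,333 tokens, i.e. about 3.3 million
```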