There aren't performance guarantees per se, because performance depends on the specification of the server you are using.
We have tested streams in blockchains over 1 TB in size with no observed problems. A stream write is just a regular blockchain transaction, so its cost does not depend on how many items have been written before.
In terms of reading, all stream retrieval APIs use indexes, so they will remain efficient even for streams with a huge number of items. Like any index, you can expect retrieval time to increase with the logarithm of the number of entries. So, very roughly, going from 10,000 to 100,000,000 items will double the retrieval time.
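To see where that "roughly double" figure comes from, here is a small illustrative sketch (not MultiChain code) of how logarithmic index lookup cost scales between two stream sizes:

```python
import math

def relative_lookup_cost(n_before, n_after):
    """Approximate ratio of index lookup cost when a stream grows
    from n_before items to n_after items, assuming O(log n) lookups."""
    return math.log(n_after) / math.log(n_before)

# Growing from 10,000 to 100,000,000 items:
# log(10^8) / log(10^4) = 8/4, i.e. roughly a 2x increase.
ratio = relative_lookup_cost(10_000, 100_000_000)
print(f"Lookup cost multiplier: {ratio:.1f}")
```

This is only a back-of-the-envelope model of index behavior; real retrieval time also includes fixed per-call overhead, so the observed slowdown is usually even smaller than the logarithmic ratio suggests.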