I'm trying to build a PaaS like Ably where I provide users with an easy-to-use pub/sub system. The thing is that I'm planning to use Kafka, but I don't know if it's the right fit. Each user can have any number of apps in the PaaS, and each app will receive different messages. My idea was that each app would get its own Kafka topic, but the number of apps could grow to millions or even billions if I get a lot of users, and Kafka isn't built for that many topics.
Should I use Kafka for this, or look into something else? Maybe there's some other way of separating messages between apps that I don't know of. I can't just put everything into a single topic, because then each consumer would receive huge numbers of messages it doesn't care about.
For the Kafka part of your question:
Update March 2021: With Kafka's new KRaft mode (short for "Kafka Raft Metadata mode"; in Early Access as of Kafka v2.8), which entirely removes ZooKeeper from Kafka's architecture, a Kafka cluster can handle millions of topics/partitions. See https://www.confluent.io/blog/kafka-without-zookeeper-a-sneak-peek/ for details.
As the above feature is not yet recommended for production use, the current practical limit is in the thousands of topics/partitions for a ZooKeeper-backed Kafka cluster.
If you want to provide a service to other applications and customers, it is better to use a separate topic per tenant, so you can leverage Kafka's authentication and authorization mechanisms to prevent users from accessing each other's data.
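As a minimal sketch of the per-tenant-topic idea: you'd derive a deterministic topic name from the user and app IDs, sanitized to Kafka's topic-name rules (only `[a-zA-Z0-9._-]`, at most 249 characters). The function and naming scheme below are hypothetical, not part of any library:

```python
import re

# Characters Kafka does NOT allow in topic names get replaced with "-".
ILLEGAL = re.compile(r"[^a-zA-Z0-9._-]")
MAX_TOPIC_LEN = 249  # Kafka's topic-name length limit

def topic_for(user_id: str, app_id: str) -> str:
    """Build a per-app topic name like 'user-42.chat-app' (hypothetical scheme)."""
    user = ILLEGAL.sub("-", user_id)
    app = ILLEGAL.sub("-", app_id)
    name = f"{user}.{app}"
    if len(name) > MAX_TOPIC_LEN:
        raise ValueError(f"topic name too long: {name!r}")
    return name
```

With a scheme like this, ACLs can then be scoped per topic (or per user-ID prefix), so each tenant's credentials only grant access to that tenant's topics.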