I understand that issue, but what does it have to do with cURL? That issue is caused by either file size limits or script timeout limits, not cURL issues.
So what is this cURL issue you’re referring to?
This. Getting a wildcard cert for 1 domain is trivial. Getting wildcard certs for many thousands of resellers, many of which have multiple domain names, is hard.
Again: WE ARE WORKING HARD TO GET THIS. You want this badly. We want this badly. iFastNet wants this badly. Everyone wants this badly.
But simply wanting something does not make it easier to get it.
Free SSL for all domains is our goal, and is iFastNet’s goal. And WE ARE TRYING VERY, VERY HARD TO GET THIS, but we can’t.
That’s not a solution. If it was that simple, we would have done it from the start.
Getting our tool to allow subdomains is easy. In fact, blocking subdomains was implemented purposely, and removing that check can be done in minutes.
The thing is that WE don’t want to block people from getting SSL certs for their subdomains. The issue is that Let’s Encrypt doesn’t let us.
The reason why we implemented this restriction is because Let’s Encrypt will just not provide these certificates. That’s not something that can be fixed on the InfinityFree or iFastNet levels.
The problem is Let’s Encrypt rate limits. There is a limit on how many certificates can be created for a single domain. If you exceed the limit, no more certificates can be created.
This limit is enough for “normal” domains, but really nowhere near enough for tens or hundreds of thousands of subdomains.
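As a rough back-of-the-envelope sketch of why this is the case (the figures used here are Let’s Encrypt’s published limits at the time of writing and may have changed: 50 new certificates per registered domain per week, and up to 100 names per certificate):

```python
# Illustrative only: shows why a per-registered-domain rate limit that is
# fine for a normal site is nowhere near enough for free-hosting subdomains.
# Both constants are assumptions based on Let's Encrypt's published limits.
CERTS_PER_WEEK = 50    # new certificates per registered domain per week
NAMES_PER_CERT = 100   # maximum names (SANs) a single certificate can cover

def weeks_to_cover(subdomains: int) -> float:
    """Minimum weeks needed to issue certificates covering all subdomains."""
    certs_needed = -(-subdomains // NAMES_PER_CERT)  # ceiling division
    return certs_needed / CERTS_PER_WEEK

# A "normal" domain with a handful of names fits in one certificate:
print(weeks_to_cover(5))        # -> 0.02 (a fraction of one week's quota)

# Hundreds of thousands of subdomains under one shared domain do not:
print(weeks_to_cover(200_000))  # -> 40.0 weeks, assuming no renewals at all
```

And that 40-week figure ignores renewals entirely; since certificates expire after 90 days, the backlog would never actually clear.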
I looked into that option too. The certificates seem to come from some small Latvian provider. Their API rate limits are not publicly documented, so it’s hard to say whether they are a viable option.
Please note that their supplier may also have rate limits; right now, not enough people know about this provider to hit those limits. If we were to officially integrate their service, we might hit those limits too and still wouldn’t be able to get those certificates.