NetTalk Central

Author Topic: Multiple instances of an API-server?

AtoB

Multiple instances of an API-server?
« on: April 07, 2021, 04:02:08 AM »
Hi All,

I'm running into several issues with a heavily used API server (the Clarion runtime/driver is not totally thread safe, and I actually need multiple database connections to the database server, while Clarion only allows one per exe). I sometimes put critical sections around my methods/endpoints to accomplish what I need and to circumvent the thread-safety issues, but under a heavy workload requests are then served sequentially instead of concurrently.
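(To make the serialisation problem concrete, here is a minimal sketch in Go, not Clarion/NetTalk, of a handler wrapped in a critical section; the names, port and timing are made up for illustration only:)

package main

import (
	"net/http"
	"sync"
	"time"
)

var dbLock sync.Mutex // stands in for the critical section around the endpoint

func handler(w http.ResponseWriter, r *http.Request) {
	dbLock.Lock()         // every request queues here...
	defer dbLock.Unlock() // ...so the protected work runs one request at a time
	time.Sleep(200 * time.Millisecond) // pretend this is the non-thread-safe DB work
	w.Write([]byte("done\n"))
}

func main() {
	http.HandleFunc("/endpoint", handler)
	http.ListenAndServe(":8080", nil)
}

With 50 concurrent callers each request still waits its turn on the lock, which is exactly the sequential behaviour described above.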

So my plan was to start several instances of the exe/API server (each listening on a different port?) to spread the load, but I probably need some "router" software to manage the traffic.
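(Purely to illustrate the "several instances" half of that plan, a rough launcher sketch in Go; it assumes the exe accepts its listen port on the command line, and the /port= switch is hypothetical:)

package main

import (
	"fmt"
	"log"
	"os/exec"
)

func main() {
	for _, port := range []int{8081, 8082, 8083} {
		// Hypothetical: a real NetTalk server may take its port from settings instead.
		cmd := exec.Command("apiserver.exe", fmt.Sprintf("/port=%d", port))
		if err := cmd.Start(); err != nil {
			log.Fatalf("instance on port %d failed to start: %v", port, err)
		}
		log.Printf("started pid %d on port %d", cmd.Process.Pid, port)
	}
	select {} // keep the launcher alive; a real one would Wait() and restart on exit
}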

Does anybody have any experience on this?
Is this a good plan?
Is there any documentation for this in the NetTalk docs?
Where to start looking :-)?

TIA

regards,
Ton

Bruce

Re: Multiple instances of an API-server?
« Reply #1 on: April 09, 2021, 03:57:44 AM »
Hi Ton,

>> while Clarion only allows one per exe

That's not strictly accurate. If you use a unique owner string for each connection, you'll get multiple connections.

>> I probably need some "router" software to manage traffic.

The obvious approach is for the client to make an API call that returns "one of the server names, including port" and then use that for its subsequent requests. That way the clients get spread amongst the servers.
Alternatively, a simple REDIRECT from the primary server to another server would also work.
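(A minimal sketch of the REDIRECT idea, in Go rather than Clarion: a primary listener that hands each client off to one of the worker instances round-robin; the ports and addresses are made up:)

package main

import (
	"net/http"
	"sync/atomic"
)

var workers = []string{"http://localhost:8081", "http://localhost:8082", "http://localhost:8083"}
var next uint64

func main() {
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		n := atomic.AddUint64(&next, 1)
		target := workers[n%uint64(len(workers))] + r.RequestURI
		// 307 keeps the method and body, so POST requests survive the hand-off
		http.Redirect(w, r, target, http.StatusTemporaryRedirect)
	})
	http.ListenAndServe(":8080", nil)
}

After the redirect the client talks directly to whichever worker it was sent to.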

Cheers
Bruce

bshields

Re: Multiple instances of an API-server?
« Reply #2 on: April 12, 2021, 01:13:53 AM »
Hi,

I do this via either Elastic Load Balancer (when hosted on AWS) or in hardware on my Fortinet routers.

Most enterprise routers support load balancing.

You just run numerous instances of your API on different ports or internal IPs, on the same or different servers. Then set up your load balancer: it appears as a single endpoint to the outside world and routes requests to the actual worker servers.
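(For anyone without a hardware balancer, the same single-endpoint idea can be sketched in software; here is a tiny round-robin reverse proxy in Go, purely illustrative, with made-up backend ports:)

package main

import (
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	backends := []string{"http://localhost:8081", "http://localhost:8082", "http://localhost:8083"}
	proxies := make([]*httputil.ReverseProxy, len(backends))
	for i, b := range backends {
		u, _ := url.Parse(b)
		proxies[i] = httputil.NewSingleHostReverseProxy(u)
	}
	var next uint64
	// One public endpoint; each request is forwarded to the next worker in turn.
	http.ListenAndServe(":80", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		n := atomic.AddUint64(&next, 1)
		proxies[n%uint64(len(proxies))].ServeHTTP(w, r)
	}))
}

Unlike the redirect approach above, the clients only ever see the one public address.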

If you share your infrastructure details, I can be more specific.

Regards
Bill