CRATO

Updated 572 days ago
  • ID: 43023302/33
With Kafka in place, Crato's central collection server sends plain-text and JSON-formatted log messages to Kafka brokers via rsyslog's omkafka output module. Early tests of Crato with a single Kafka broker revealed that the broker would not always be available to receive messages. To reduce the potential for log loss, Crato is now built with a three-broker Kafka cluster, preserving service even if two brokers fail. Additionally, it employs a disk-assisted memory queue on its central server to mitigate network partitions between the server and the Kafka cluster, and to provide an emergency store of unsent messages if rsyslog shuts down or all Kafka brokers are down...

So, with the first option too heavy-handed and risky, and the second too limiting, we opted to adapt rsyslog, an already-in-place system-level logging daemon, to collect logs from applications as well as from its default system processes. By doing so, Crato is able to create customized configurations that supplement, not override, the functioning of rsyslog.
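As a rough sketch of how these pieces fit together (not Crato's actual configuration), an rsyslog setup in this spirit would load the omkafka output module and attach a disk-assisted memory queue to the forwarding action. The broker addresses, topic name (crato-logs), queue file name, and size limit below are placeholder assumptions.

    # Load rsyslog's Kafka output module
    module(load="omkafka")

    # Forward messages to a three-broker Kafka cluster. The queue.*
    # parameters define a disk-assisted memory queue: an in-memory
    # (LinkedList) queue that spills to the named file when brokers are
    # unreachable and is written out if rsyslog shuts down.
    # Broker addresses, topic, and limits are placeholders.
    action(
        type="omkafka"
        broker=["kafka1:9092", "kafka2:9092", "kafka3:9092"]
        topic="crato-logs"
        queue.type="LinkedList"
        queue.filename="omkafka-queue"
        queue.maxDiskSpace="1g"
        queue.saveOnShutdown="on"
        action.resumeRetryCount="-1"
    )

Setting queue.filename is what makes the in-memory queue disk-assisted: messages buffered during a broker outage can spill to disk instead of living only in memory, and queue.saveOnShutdown preserves whatever is still unsent if the daemon stops.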
Interest Score: 2
HIT Score: 0.00
Domain: crato-logging.github.io
Actual: crato-logging.github.io
IP: 185.199.108.153, 185.199.109.153, 185.199.110.153, 185.199.111.153
Status: OK
Category: Company