log4stach elastic search appender memory leak #87

Open
pikmano opened this issue Jul 29, 2021 · 4 comments

@pikmano

pikmano commented Jul 29, 2021

Hey,
I'm facing a major issue with my .NET process. At some random point the HTTP appender gets stuck and the logs accumulate in memory (it stops sending them to Elastic) until the process brings down the server.

Did you face this problem? I'm "shipping" around 1 million logs per second.

@pikmano pikmano changed the title log4stach elastic search appender mamory leak log4stach elastic search appender memory leak Jul 29, 2021
@the7thchakra

I'm having a very similar issue.

@urielha
Owner

urielha commented Dec 13, 2021

Can you share your configuration?

I think you can work around this if you're OK with losing a few log messages in case of overflow.

@the7thchakra

I'd be happy with some lost messages right now!

If there isn't already a flush-on-payload-size-limit feature (or a plan for one), that would be useful to have. I'm using AWS OpenSearch and we're limited to 10 MB per message. With the settings below, as well as some application-level message-size capping, we're regularly blowing past that.

<?xml version="1.0" encoding="utf-8"?>
<log4net>
	<root>
		<level value="ALL" />
		<appender-ref ref="elasticsearch" />
	</root>
	<appender name="elasticsearch" type="log4stash.ElasticSearchAppender, log4stash">
		<Server><!-- redacted --></Server>
		<Port>443</Port>
		<AllowSelfSignedServerCert>False</AllowSelfSignedServerCert>
		<Ssl>True</Ssl>
		<AuthenticationMethod>
			<Basic>
				<Username><!-- redacted --></Username>
				<Password><!-- redacted --></Password>
			</Basic>
		</AuthenticationMethod>

		<IndexName>%{environment}-%{application}-%{log.level}-%{+yyyy-MM-dd}</IndexName>
		<IndexType>_doc</IndexType>
		<IndexAsync>False</IndexAsync>

		<BulkSize>256</BulkSize><!-- we're getting lots of payload too large messages -->
		<BulkIdleTimeout>5000</BulkIdleTimeout>
		<DropEventsOverBulkLimit>False</DropEventsOverBulkLimit>
		<ElasticSearchTimeout>90000</ElasticSearchTimeout><!-- we're getting connection rejected a lot from the server; I'd like to increase BulkSize, but with the payload too large messages, I don't think I can right now -->

		<SerializeObjects>True</SerializeObjects>

		<FixedFields>Partial</FixedFields>

		<ElasticFilters>
			<Remove>
				<Key>log4net:HostName</Key>
			</Remove>
			<Remove>
				<Key>log4net:Identity</Key>
			</Remove>

			<Rename>
				<Key>AppDomain</Key>
				<RenameTo>application</RenameTo>
			</Rename>
			<Rename>
				<Key>HostName</Key>
				<RenameTo>host.name</RenameTo>
			</Rename>
			<Rename>
				<Key>Level</Key>
				<RenameTo>log.level</RenameTo>
			</Rename>
			<Rename>
				<Key>log4net:UserName</Key>
				<RenameTo>UserName</RenameTo>
			</Rename>
			<Rename>
				<Key>LoggerName</Key>
				<RenameTo>log.logger</RenameTo>
			</Rename>
			<Rename>
				<Key>Message</Key>
				<RenameTo>message.text</RenameTo>
			</Rename>
			<Rename>
				<Key>ThreadName</Key>
				<RenameTo>thread.name</RenameTo>
			</Rename>
			<Rename>
				<Key>UserName</Key>
				<RenameTo>server.user.name</RenameTo>
			</Rename>
		</ElasticFilters>
	</appender>
</log4net>

@urielha
Owner

urielha commented Dec 14, 2021

Hi,

We have a mechanism to drop events over the bulk limit, but you will have to play with it to reach an optimal setup.

Please read about DropEventsOverBulkLimit here: #77 (comment)

Basically, you will have to set BulkIdleTimeout to a smaller timeout (depending on how many log messages you produce per second) and set DropEventsOverBulkLimit to true.
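
For example, a minimal sketch of the relevant appender settings, assuming the config posted above (the values are placeholders and would need tuning to your log volume):

	<BulkSize>256</BulkSize>
	<BulkIdleTimeout>500</BulkIdleTimeout><!-- placeholder value: flush partial bulks more often than the 5000 ms used above -->
	<DropEventsOverBulkLimit>True</DropEventsOverBulkLimit><!-- drop events beyond the bulk limit instead of queuing them in memory -->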
