Serilog with Graylog not logging
I have a Web API and I'm trying to log messages to Graylog using Serilog. No matter what I do, no messages show up in my Graylog application. This is what I have so far:
This is in my Program.cs
```csharp
var logger = new LoggerConfiguration()
    .ReadFrom.Configuration(builder.Configuration)
    .Enrich.FromLogContext()
    .CreateLogger();

builder.Logging.ClearProviders();
builder.Logging.AddSerilog(logger);
```
This is my configuration:
"Serilog": {
"Using": [ "Serilog.Sinks.Graylog" ],
"MinimumLevel": "Information",
"WriteTo": [
{
"Name": "Graylog",
"Args": {
"hostnameOrAddress": "127.0.0.1",
"port": "12201",
"transportType": "Udp"
}
}
],
"Properties": {
"Application": "Centralized logging application"
}
},
"AllowedHosts": "*"
}
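For comparison, this is roughly what that configuration section should produce when built up in code (a hedged sketch, assuming the Serilog.Sinks.Graylog package and its GraylogSinkOptions type; useful for separating configuration-binding problems from connectivity problems):

```csharp
// Sketch only: configure the Graylog sink directly in code as a sanity check,
// bypassing appsettings.json binding. Assumes the Serilog.Sinks.Graylog package;
// verify the GraylogSinkOptions / TransportType namespaces against the installed version.
using Serilog;
using Serilog.Sinks.Graylog;
using Serilog.Sinks.Graylog.Core.Transport;

var testLogger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Graylog(new GraylogSinkOptions
    {
        HostnameOrAddress = "127.0.0.1",
        Port = 12201,
        TransportType = TransportType.Udp
    })
    .CreateLogger();

testLogger.Error(new Exception("Exception Message"), "Test message sent directly to Graylog");
```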
And I'm trying to log an error like this:

```csharp
_logger.LogError(0, new Exception("Exception Message"), "Message", new WeatherForecast());
```
Can someone please help me? I need to see my exception inside Graylog.
Thanks a lot in advance.
See also questions close to this topic
-
C# - Adding condition to func results in stack overflow exception
I have a Func as part of a specification class which sorts the given IQueryable:
```csharp
Func<IQueryable<T>, IOrderedQueryable<T>>? Sort { get; set; }
```
When I add more than one condition to the Func like below, it results in a stack overflow exception.
```csharp
spec.OrderBy(sc => sc.Case.EndTime).OrderBy(sc => sc.Case.StartTime);
```
The OrderBy method is implemented like this
```csharp
public ISpecification<T> OrderBy<TProperty>(Expression<Func<T, TProperty>> property)
{
    _ = Sort == null
        ? Sort = items => items.OrderBy(property)
        : Sort = items => Sort(items).ThenBy(property);
    return this;
}
```
Chaining or using separate lines doesn't make a difference.
This problem gets resolved if I assign a new instance of the specification and set its Func, but I don't want to assign a new instance every time. Please suggest what I am missing here and how to reuse the same instance (if possible).
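(For reference, a hedged sketch of one likely cause and workaround: the lambda `items => Sort(items).ThenBy(property)` captures the Sort property itself, so after the assignment it ends up calling the newly assigned delegate and recurses until the stack overflows. Capturing the previous delegate in a local before reassigning avoids the self-reference. Names mirror the snippet above; untested against the full specification class.)

```csharp
public ISpecification<T> OrderBy<TProperty>(Expression<Func<T, TProperty>> property)
{
    // Capture the current delegate in a local so the new lambda closes over the
    // old value instead of the Sort property (which would point back at itself
    // after the assignment below).
    var previous = Sort;

    if (previous == null)
        Sort = items => items.OrderBy(property);
    else
        Sort = items => previous(items).ThenBy(property);

    return this;
}
```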
-
How to project fields for a dictionary (C#, MongoDB)
I am trying my luck here. I have a model like the following:
```csharp
public class RowData : BaseBsonDefinition
{
    ...
    [BsonExtraElements]
    [BsonDictionaryOptions(DictionaryRepresentation.ArrayOfDocuments)]
    public Dictionary<string, object> Rows { get; set; } = new(StringComparer.OrdinalIgnoreCase);
    ...
}
```
As a result, the document in MongoDB looks like:
{ "_id": { "$binary": { "base64": "HiuI1sgyT0OZmcgGUit2dw==", "subType": "03" } }, "c1": "AAA", "c8": "Fully Vac", "c10": "", }
Those c1, c8 and c10 fields are keys from the dictionary; my question is how to dynamically project those fields.
I tried
```csharp
Builders<RowData>.Projection.Exclude(p => "c1")
```
It seems the MongoDB driver can not handle a value directly.
Could anyone point me in the correct direction?
Thanks,
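(For reference, a hedged sketch: the driver's projection builder also accepts field names as plain strings via the implicit FieldDefinition conversion, which may allow projecting dictionary keys chosen at runtime. The collection variable and field list below are illustrative assumptions, untested against the model above.)

```csharp
// Sketch only: build an exclusion projection from runtime field names.
var fieldsToExclude = new[] { "c1", "c8" };

var projection = Builders<RowData>.Projection.Combine(
    fieldsToExclude.Select(f => Builders<RowData>.Projection.Exclude(f)));

var docs = await collection.Find(FilterDefinition<RowData>.Empty)
                           .Project(projection)
                           .ToListAsync();
```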
-
How do I add a new DataSource to an already data-bound CheckBoxList
I'm building a web form that shows database items (tables, rows, FKs, ...).
I have a CheckBoxList of tables (chkListTable) which will show a new CheckBoxList of rows (chkListRow) every time SelectedIndexChanged fires on chkListTable. The problem is that I can show the items from chkListTable with one selected item, but I don't know how to show chkListRow when multiple items from chkListTable are selected. Here are my codes:
aspx:

```aspx
<div>
    <asp:Label ID="Label2" runat="server" Text="Table: "></asp:Label>
    <asp:CheckBoxList ID="chkListTable" runat="server"
        DataTextField="name" DataValueFeild="name"
        AutoPostBack="true"
        OnSelectedIndexChanged="chkListTable_SelectedIndexChanged">
    </asp:CheckBoxList>
</div>
<div>
    <asp:CheckBoxList ID="chkListRow" runat="server"
        DataTextField="COLUMN_NAME" DataValueField="COLUMN_NAME"
        RepeatDirection="Horizontal">
    </asp:CheckBoxList>
</div>
```
aspx.cs:

```csharp
protected void chkListTable_SelectedIndexChanged(object sender, EventArgs e)
{
    tableName.Clear();
    foreach (ListItem item in chkListTable.Items)
    {
        if (item.Selected)
        {
            tableName.Add(item.Text.Trim());
        }
    }
    for (int i = 0; i < tableName.Count; i++)
    {
        String query = "USE " + dbname +
            " SELECT * FROM information_schema.columns" +
            " WHERE table_name = '" + tableName[i] + "'" +
            " AND COLUMN_NAME != 'rowguid'";
        chkListRow.DataSource = Program.ExecSqlDataReader(query);
        chkListRow.DataBind();
        Program.conn.Close();
    }
}
```
Program.cs:

```csharp
public static bool Connect()
{
    if (Program.conn != null && Program.conn.State == ConnectionState.Open)
        Program.conn.Close();
    try
    {
        Program.conn.ConnectionString = Program.constr;
        Program.conn.Open();
        return true;
    }
    catch (Exception e)
    {
        return false;
    }
}

public static SqlDataReader ExecSqlDataReader(String query)
{
    SqlDataReader myreader;
    SqlCommand sqlcmd = new SqlCommand(query, Program.conn);
    sqlcmd.CommandType = CommandType.Text;
    if (Program.conn.State == ConnectionState.Closed)
        Program.conn.Open();
    try
    {
        myreader = sqlcmd.ExecuteReader();
        return myreader;
        myreader.Close();
    }
    catch (SqlException ex)
    {
        Program.conn.Close();
        return null;
    }
}
```
I want my display to be like this:
[x]Table1 [x]Table2 [ ]Table3 [ ]Row1(Table1) [ ]Row2(Table1) [ ]Row3(Table1) [ ]Row1(Table2) [ ]Row2(Table2)
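(For reference, a hedged sketch of one way to show rows for all selected tables: the loop above rebinds chkListRow on every iteration, so only the last table's columns survive; accumulating everything into a single DataTable and binding once avoids that. Names such as tableName, dbname and Program.conn mirror the snippets above; untested.)

```csharp
// Sketch only: collect the columns of every selected table into one DataTable, then bind once.
var allColumns = new DataTable();

foreach (string table in tableName)
{
    string query = "USE " + dbname +
        " SELECT * FROM information_schema.columns" +
        " WHERE table_name = @table AND COLUMN_NAME != 'rowguid'";

    using (var cmd = new SqlCommand(query, Program.conn))
    {
        cmd.Parameters.AddWithValue("@table", table);
        if (Program.conn.State == ConnectionState.Closed)
            Program.conn.Open();

        using (var reader = cmd.ExecuteReader())
        {
            allColumns.Load(reader);   // Load appends rows on each call
        }
    }
}
Program.conn.Close();

chkListRow.DataSource = allColumns;
chkListRow.DataBind();
```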
-
FluentValidation with conditions in ASP.NET Core
I am working on an ASP.NET Core 6.0 Web API project (clean architecture, CQRS).
I am using FluentValidation to validate commands and queries.
```csharp
public class CreateSiteDestinationSectionsCommand : IRequest<x>
{
    public int DestinationSectionId { get; set; }
    public int DestinationSectionTitleId { get; set; }
    public int SiteCodeId { get; set; }
    public string Description { get; set; }
    public List<DestinationImageDto> Images { get; set; }
    public List<string> Links { get; set; }
}
```
This is what I did inside CreateSiteDestinationSectionsCommandHandler:
```csharp
var DestinationSectionTitleId = request.DestinationSectionTitleId;
if (DestinationSectionTitleId != 10)
{
    if (DestinationSectionTitleId == 1 || DestinationSectionTitleId == 2 || DestinationSectionTitleId == 5 ||
        DestinationSectionTitleId == 7 || DestinationSectionTitleId == 8 || DestinationSectionTitleId == 9 ||
        DestinationSectionTitleId == 11)
    {
        var sectionimageCount = request.Images.Count;
        if (sectionimageCount != 1)
        {
            throw new ApiValidationException("Section has not more than one image");
        }
        else if (DestinationSectionTitleId == 10 && request.Images != null)
        {
            sectionimageCount = request.Images.Count;
            if (sectionimageCount != 0)
            {
                throw new ApiValidationException("Section doesnot have any image");
            }
        }
    }
}
```
But instead of handling validation inside the command handler, I have to handle it in CreateSiteDestinationSectionsCommandValidator.
I tried this,
```csharp
RuleFor(x => x.Images)
    .Must(x => x != null)
    .When(x => x.DestinationSectionId != 10)
    .WithMessage("Site Section image required");

RuleFor(x => x.DestinationSectionId).NotEmpty();

RuleFor(x => x.Images)
    .Must(x => x.Count != 1)
    .When(x => x.DestinationSectionId == 1
        && x.DestinationSectionId == 1
        && x.DestinationSectionId == 2
        && x.DestinationSectionId == 5
        && x.DestinationSectionId == 7
        && x.DestinationSectionId == 8
        && x.DestinationSectionId == 9
        && x.DestinationSectionId == 11)
    .WithMessage("Site Section has not more than one image");
```
When I test through a Postman request, even when I send DestinationSectionId = 10 (and no images), I get this validation error:
"errors": { "Images": [ "Site Section image required" ] }
And even when I send more than one image for DestinationSectionId = 1, I do not get a validation error, but I should get the validation error:
Site Section has not more than one image
Why do these validations not work correctly? What did I miss?
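(For reference, a hedged sketch of how those rules might be expressed. Two things stand out: the handler checks DestinationSectionTitleId while the validator checks DestinationSectionId, and the When condition chains && comparisons of the same property against different values, which can never all be true at once; a Contains check over the allowed ids is one alternative. Verify against your FluentValidation version; this is an assumption, not a confirmed fix.)

```csharp
// Sketch only: section ids that allow exactly one image (taken from the handler code above).
private static readonly int[] SingleImageSections = { 1, 2, 5, 7, 8, 9, 11 };

// Inside the validator's constructor:

// An image is required for every section except 10.
RuleFor(x => x.Images)
    .NotEmpty()
    .When(x => x.DestinationSectionTitleId != 10)
    .WithMessage("Site Section image required");

// Sections in the list may carry at most one image.
RuleFor(x => x.Images)
    .Must(images => images != null && images.Count == 1)
    .When(x => SingleImageSections.Contains(x.DestinationSectionTitleId))
    .WithMessage("Site Section has not more than one image");

// Section 10 must not carry any images.
RuleFor(x => x.Images)
    .Must(images => images == null || images.Count == 0)
    .When(x => x.DestinationSectionTitleId == 10)
    .WithMessage("Section does not have any image");
```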
-
Using RestSharp to request a file fails with memory issue
I have two APIs talking to each other on Kubernetes. The first API asks the second API for a small file using RestSharp (in ASP.NET). The file is 8 KB, so not large at all.
Yet I get the following message on the API that wants to receive the file:
```
Exception of type 'System.OutOfMemoryException' was thrown.
   at RestSharp.RestClient.ThrowIfError(RestResponse response)
   at RestSharp.RestClientExtensions.GetAsync(RestClient client, RestRequest request, CancellationToken cancellationToken)
   at Aftermarket.Server.AdministrationService.Server.Services.Services.RunSessionService.DownloadRunSession(String foldername) in /src/Aftermarket.Server.DbApi/Server/Services/Services/RunSessionService.cs:line 59
```
The code I use to call the other API and ask for the file looks as follows:
```csharp
public async Task<byte[]> DownloadRunSession(string foldername)
{
    try
    {
        var request = new RestRequest($"{config["WebApiServer:WebApiServerUrl"]}/blob/{foldername}");
        var response = await client.GetAsync(request);
        if (!response.IsSuccessful)
        {
            Console.WriteLine("File download failed");
            Console.WriteLine(response.StatusCode);
            return null;
        }
        return response.RawBytes;
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        Console.WriteLine(e.StackTrace);
        return null;
    }
}
```
The API that responds by sending the file has the following controller method:
```csharp
public IActionResult GetBlob([FromQuery] string folderName, [FromServices] GetBlobsService _getBlobsService)
{
    _logger.Info(folderName);
    Guid guid = Guid.NewGuid();
    if (_env.EnvironmentName == "dev" || _env.EnvironmentName == "prod")
    {
        byte[] blob = _getBlobsService.GetBlob(folderName, guid, _logger);
        if (blob == null)
            return this.NotFound();
        else
        {
            return File(blob, "application/force-download", guid + ".zip");
        }
    }
    else
    {
        return Content("This function is only available in Dev/Uat Environment");
    }
}
```
Does anyone have any idea how an 8 KB file is causing this issue?
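(For reference, a hedged sketch of an alternative download path using RestSharp's DownloadDataAsync extension, which returns the response body as a byte array; it exists in recent RestSharp versions, so verify against the version in use. config and client mirror the snippet above.)

```csharp
// Sketch only: fetch the payload as raw bytes instead of going through GetAsync + RawBytes.
public async Task<byte[]> DownloadRunSession(string foldername)
{
    var request = new RestRequest($"{config["WebApiServer:WebApiServerUrl"]}/blob/{foldername}");
    try
    {
        // Returns the response body as a byte[].
        return await client.DownloadDataAsync(request);
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        Console.WriteLine(e.StackTrace);
        return null;
    }
}
```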
-
How is Windows Authentication Wired Up?
I'm in the process of creating an ASP.NET Core 6 MVC app in VS 2022 which will eventually be deployed in a Docker container. Windows Authentication will be used as it's an internal app. When creating this project from scratch with File -> New Project -> ASP.NET Core 6 Web App (Model-View-Controller) and with Windows Authentication enabled, everything works as expected.
This is where it gets weird. Since this will be a Docker container on Linux, I commented out the IIS settings in launchSettings.json.
{ //"iisSettings": { // "windowsAuthentication": true, // "anonymousAuthentication": false, // "iisExpress": { // "applicationUrl": "http://localhost:60583", // "sslPort": 44391 // } //}, "profiles": { "ASPNETCORE6": { "commandName": "Project", "dotnetRunMessages": true, "launchBrowser": true, "applicationUrl": "https://localhost:7276;http://localhost:5276", "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development" } }, "IIS Express": { "commandName": "IISExpress", "launchBrowser": true, "environmentVariables": { "ASPNETCORE_ENVIRONMENT": "Development" } } } }
To recreate this issue, comment out the following from the generated Program.cs file:
```csharp
//using Microsoft.AspNetCore.Authentication.Negotiate;

//builder.Services.AddAuthentication(NegotiateDefaults.AuthenticationScheme)
//    .AddNegotiate();

//builder.Services.AddAuthorization(options =>
//{
//    options.FallbackPolicy = options.DefaultPolicy;
//});

//app.UseAuthentication();
//app.UseAuthorization();
```
Now, when I run a debugging session using IIS Express, the browser window renders with my correct domain name displayed. How is this even possible? Is something being cached? Also, the browser goes to http://localhost:44391/ when debugging with IIS Express even though this is commented out in launchSettings.json.
Note: Windows authentication is not working when I debug with Kestrel, which is what I would expect.
-
AppInsights telemetry not co-existing with Serilog logging
my Program.cs is configured like this
```csharp
using Infrastructure;
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.AzureAD.UI;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.EntityFrameworkCore;
using Microsoft.Identity.Web;
using Microsoft.Identity.Web.TokenCacheProviders.InMemory;
using Serilog;
using Serilog.Events;
using Services;
using Services.Contracts;

var MyAllowSpecificOrigins = "_myAllowSpecificOrigins";
var builder = WebApplication.CreateBuilder(args);

//builder.Services.AddApplicationInsightsTelemetry(opt => opt.EnableActiveTelemetryConfigurationSetup = true);

var logger = new LoggerConfiguration()
    // .MinimumLevel.Debug()
    .MinimumLevel.Override("Microsoft", LogEventLevel.Information)
    // Filter out ASP.NET Core infrastructure logs that are Information and below
    .MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Warning)
    .ReadFrom.Configuration(builder.Configuration)
    .Enrich.FromLogContext()
    .CreateLogger();

try
{
    // clear all existing logging providers
    builder.Logging.ClearProviders();
    builder.Logging.AddSerilog(logger);

    builder.Services.AddApplicationInsightsTelemetry();

    var configuration = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json")
        .Build();

    //builder.Services.AddAuthentication(AzureADDefaults.BearerAuthenticationScheme)
    //    .AddAzureADBearer(options => configuration.Bind("AzureAd", options));

    builder.Services.AddMicrosoftIdentityWebApiAuthentication(configuration, "AzureAd");

    builder.Services.AddCors(options =>
    {
        options.AddPolicy(MyAllowSpecificOrigins, builder => builder
            .WithOrigins("http://localhost:4200", "Access-Control-Allow-Origin", "Access-Control-Allow-Credentials")
            .AllowAnyMethod()
            .AllowAnyHeader()
            .AllowCredentials()
        );
    });

    var connectionString = configuration.GetConnectionString("BBBankDBConnString");

    // Add services to the container.
    builder.Services.AddControllers();
    builder.Services.AddScoped<ITransactionService, TransactionService>();
    builder.Services.AddScoped<DbContext, BBBankContext>();
    builder.Services.AddDbContext<BBBankContext>(
        b => b.UseSqlServer(connectionString)
              .UseLazyLoadingProxies(true)
    );

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    app.UseCors(MyAllowSpecificOrigins);
    app.UseAuthentication();
    app.UseAuthorization();
    app.MapControllers();
    app.Run();
}
catch (Exception ex)
{
    logger.Fatal(ex, "Error Starting BBBank API");
}
finally
{
    logger.Dispose();
}
```
and the settings are like this
{ "Logging": { "LogLevel": { "Default": "Information", "Microsoft.AspNetCore": "Warning" } }, "AllowedHosts": "*", "ConnectionStrings": { "BBBankDBConnString": "Server=tcp:xxxx.database.windows.net,1433;Initial Catalog=BBBankDB;Persist Security Info=False;User ID=xxxx;Password=xxxx;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" }, "AzureAd": { "Instance": "https://login.microsoftonline.com/", "Domain": "bbbankAD.onmicrosoft.com", "TenantId": "0c087d99-9bb7-41d4-bd58-80846660b536", "ClientId": "api://bbbankapi", "Audience": "api://bbbankapi" }, // To configure app insight custom events (custom events for User Specific Actions (User Accessed Accounts Data)) "ApplicationInsights": { "InstrumentationKey": "xxx-ecc6-43d0-9fab-a0d996f4cf07", "EnableAdaptiveSampling": false, "EnablePerformanceCounterCollectionModule": false }, // To Configure App Insights logging (Applucation Specific Logs e.g Controller Hit) "Serilog": { "Using": [ "Serilog.Sinks.ApplicationInsights", "Serilog.Sinks.File" ], // "Serilog.Sinks.Debug", "Serilog.Sinks.File", "MinimumLevel": "Information", // Where do we want to write our logs to? Choose from a large number of sinks: // https://github.com/serilog/serilog/wiki/Provided-Sinks. "WriteTo": [ //{ // "Name": "Debug" //}, { "Name": "File", "Args": { "path": "Logs/log.txt" } }, { "Name": "ApplicationInsights", "Args": { "instrumentationKey": "xxx-ecc6-43d0-9fab-a0d996f4cf07", //"restrictedToMinimumLevel": "Information", "telemetryConverter": "Serilog.Sinks.ApplicationInsights.Sinks.ApplicationInsights.TelemetryConverters.TraceTelemetryConverter, Serilog.Sinks.ApplicationInsights" } } ], "Enrich": [ "FromLogContext", "WithMachineName", "WithThreadId" ], "Properties": { "Application": "Sample" } } }
and the packages I have are these:
```xml
<ItemGroup>
  <PackageReference Include="Microsoft.AspNetCore.Authentication.AzureAD.UI" Version="6.0.4" />
  <PackageReference Include="Microsoft.EntityFrameworkCore" Version="6.0.3" />
  <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="6.0.3">
    <PrivateAssets>all</PrivateAssets>
    <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
  </PackageReference>
  <PackageReference Include="Microsoft.EntityFrameworkCore.Proxies" Version="6.0.3" />
  <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="6.0.3" />
  <PackageReference Include="Microsoft.Identity.Web.MicrosoftGraph" Version="1.23.1" />
  <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.20.0" />
  <PackageReference Include="Serilog.AspNetCore" Version="5.0.0" />
  <PackageReference Include="Serilog.Sinks.ApplicationInsights" Version="3.1.0" />
  <!--<PackageReference Include="Serilog.Enrichers.Environment" Version="2.1.3" />
  <PackageReference Include="Serilog.Enrichers.Thread" Version="3.1.0" />
  <PackageReference Include="Serilog.Sinks.Async" Version="1.4.0" />-->
</ItemGroup>
```
Problem: the telemetry is working but the logging is not. I have tried different combinations of packages as well, and also tried commenting out one or the other, but the results are still the same.
```csharp
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Mvc;

namespace BBBankAPI.Controllers
{
    [ApiController]
    [Route("[controller]")]
    public class WeatherForecastController : ControllerBase
    {
        private static readonly string[] Summaries = new[]
        {
            "Freezing", "Bracing", "Chilly", "Cool", "Mild",
            "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
        };

        private readonly ILogger<WeatherForecastController> _logger;
        private readonly TelemetryClient telemetryClient;

        public WeatherForecastController(ILogger<WeatherForecastController> logger, TelemetryClient telemetryClient)
        {
            _logger = logger;
            this.telemetryClient = telemetryClient;
        }

        [HttpGet]
        public IEnumerable<WeatherForecast> Get()
        {
            _logger.LogInformation("This is not showing in Azure portal but showing in file");
            telemetryClient.TrackTrace("This is showing in Azure Portal");

            return Enumerable.Range(1, 5).Select(index => new WeatherForecast
            {
                Date = DateTime.Now.AddDays(index),
                TemperatureC = Random.Shared.Next(-20, 55),
                Summary = Summaries[Random.Shared.Next(Summaries.Length)]
            })
            .ToArray();
        }
    }
}
```
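(For reference, a hedged sketch of the wiring suggested by the Serilog.Sinks.ApplicationInsights documentation: give the sink the TelemetryConfiguration that AddApplicationInsightsTelemetry registers, rather than only an instrumentation key from appsettings.json. Verify the API shapes against the sink and Serilog.AspNetCore versions in use; this is an assumption, not a confirmed fix.)

```csharp
// Sketch only: configure the Application Insights sink against the DI-registered
// TelemetryConfiguration so Serilog and the AppInsights SDK share one pipeline.
using Microsoft.ApplicationInsights.Extensibility;
using Serilog;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddApplicationInsightsTelemetry();

builder.Host.UseSerilog((context, services, loggerConfiguration) => loggerConfiguration
    .ReadFrom.Configuration(context.Configuration)
    .Enrich.FromLogContext()
    .WriteTo.ApplicationInsights(
        services.GetRequiredService<TelemetryConfiguration>(),
        TelemetryConverter.Traces));
```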
-
Amazon S3 sink in Serilog is not working. How to use the S3 sink with Serilog?
```csharp
public class Program
{
    public static void Main(string[] args)
    {
        var levelSwitch = new LoggingLevelSwitch();
        levelSwitch.MinimumLevel = LogEventLevel.Information;
        try
        {
            var logger = new LoggerConfiguration().WriteTo
                .AmazonS3(
                    "log.txt",
                    "xxxxxxx",                      // bucketName
                    Amazon.RegionEndpoint.EUWest1,
                    "xxxx",                         // accessKey
                    "xxxx",                         // secretKey
                    restrictedToMinimumLevel: LogEventLevel.Information,
                    outputTemplate: "{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} [{Level:u3}] {Message:lj}{NewLine}{Exception}",
                    new CultureInfo("de-DE"),
                    levelSwitch: levelSwitch,
                    rollingInterval: RollingInterval.Minute,
                    encoding: Encoding.Unicode
                )
                .CreateLogger();

            logger.Debug("Hello world Debug mode on");
            logger.Information("Hello world Debug mode on");
            logger.Error("Hello world Debug mode on");
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
    }
}
```
If I remove the output template and culture, I face an ambiguity error. How can I use this Amazon S3 sink correctly? I am not getting any error in the catch block, and nothing happens in the S3 bucket; no file is being created.
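(For reference, a hedged sketch of one thing worth checking: the S3 sink batches events, so a short-lived console app can exit before anything is uploaded; disposing the logger, or calling Log.CloseAndFlush() when logging through the static Log.Logger, forces the flush. This is an assumption about the symptom, not a confirmed diagnosis.)

```csharp
// Sketch only (assumes the "logger" variable from the snippet above).
// Serilog.Core.Logger implements IDisposable; disposing it flushes batching sinks
// such as the Amazon S3 sink before the process exits.
logger.Information("Hello world");
((IDisposable)logger).Dispose();

// Equivalent when logging through the static Serilog.Log class:
// Log.CloseAndFlush();
```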
-
Graylog Log Data in Interface
Can we see the full log data in the Graylog interface at a single time, or only some amount of it? What is the maximum quantity of log data we can see in the Graylog interface at once?
-
What are the different log templates I have to use?
Setting up my Graylog instance, I'm quite confused about the different templates I have to use.
On Debian 10 I added /etc/rsyslog.d/graylog.conf with the following value:

```
*.* @graylog.i.abc.de:13526;RSYSLOG_SyslogProtocol23Format
```

and everything in /var/log is getting parsed nicely. Now I see that Serviio uses a different log pattern:
```
2022-04-06T15:44:57,701 INFO [PlaylistMaintainerWorker] Started looking for playlist changes
```
And Plex as well:
```
Apr 01, 2022 22:33:31.142 [0x7fb0bcb3bb38] INFO - Plex DLNA Server v1.25.8.5663-e071c3d62 - Debian GNU/Linux PC x86_64 - build: linux-x86_64 - GMT 02:00
```
I'm scratching my head and can't find anything relevant on Google. What I did find out is that you can write your own rsyslogd templates. But I'm quite sure the formats I showed you are standardized, aren't they? If so, can someone give me a hint what they're called and where I can find a list of them?
-
Unable to build the graylog web plugin using "mvn package"
I have downloaded the latest version of the plugin and am trying to build it using the "mvn package" command. I am getting the error below. Please help me resolve the issue.
```
[INFO] [webpack-cli] Failed to load '/Users/divagu/Desktop/graylog-plugins/plugins-4.0/graylog-plugin-correlation-count/webpack.config.js' config
[INFO] [webpack-cli] TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string. Received undefined
```
Environment:
- Graylog 4.2.7
- Node.js v17.7.2
- Yarn v1.22.18
- OS: macOS

Please help me in resolving this issue.