feature: thread-safe file and JSON file logging, deep clone and compare of objects

Maksym Sadovnychyy 2025-11-01 19:22:32 +01:00
parent 1fba73f690
commit 28a698a03b
14 changed files with 1024 additions and 79 deletions

README.md

@@ -13,6 +13,10 @@
- [DataTable Extensions](#datatable-extensions)
- [Guid Extensions](#guid-extensions)
- [Logging](#logging)
- [File Logger](#file-logger)
- [JSON File Logger](#json-file-logger)
- [Threading](#threading)
- [Lock Manager](#lock-manager)
- [Networking](#networking)
- [Network Connection](#network-connection)
- [Ping Port](#ping-port)
@@ -511,33 +515,75 @@ string result = "example".Left(3); // "exa"
#### Object Extensions
The `ObjectExtensions` class provides advanced methods for working with objects, including serialization, deep cloning, and structural equality comparison.
---
#### Features
1. **JSON Serialization**:
- Convert objects to JSON strings with optional custom converters.
2. **Deep Cloning**:
- Create a deep clone of an object, preserving reference identity and supporting cycles.
3. **Structural Equality**:
- Compare two objects deeply for structural equality, including private fields.
4. **Snapshot Reversion**:
- Revert an object to a previous state by copying all fields from a snapshot.
---
#### Example Usage
##### JSON Serialization
```csharp
var person = new { Name = "John", Age = 30 };
string json = person.ToJson();
// With custom converters
var converters = new List<JsonConverter> { new CustomConverter() };
string jsonWithConverters = person.ToJson(converters);
```
##### Deep Cloning
```csharp
var original = new Person { Name = "John", Age = 30 };
var clone = original.DeepClone();
```
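Cycles and shared references are preserved during cloning. A minimal sketch (the `Person`/`Address` types with a mutual `Addr`/`Owner` reference are illustrative, mirroring the cycle test added in this commit):
```csharp
var person = new Person { Name = "Root" };
var address = new Address { City = "Naples", Owner = person };
person.Addr = address; // person -> address -> person cycle

var clone = person.DeepClone();

// The cycle is rebuilt inside the clone instead of pointing back at the original.
bool cyclePreserved = ReferenceEquals(clone, clone.Addr!.Owner);   // true
bool sharedWithOriginal = ReferenceEquals(person.Addr, clone.Addr); // false
```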
##### Structural Equality
```csharp
var person1 = new Person { Name = "John", Age = 30 };
var person2 = new Person { Name = "John", Age = 30 };
bool areEqual = person1.DeepEqual(person2); // True
```
##### Snapshot Reversion
```csharp
var snapshot = new Person { Name = "John", Age = 30 };
var current = new Person { Name = "Doe", Age = 25 };
current.RevertFrom(snapshot);
// current.Name is now "John"
// current.Age is now 30
```
---
#### Best Practices
1. **Use Deep Cloning for Complex Objects**:
- Ensure objects are deeply cloned when working with mutable reference types.
2. **Validate Structural Equality**:
- Use `DeepEqual` for scenarios requiring precise object comparisons.
3. **Revert State Safely**:
- Use `RevertFrom` to safely restore object states in tracked entities (see the sketch after this list).
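A minimal sketch of the snapshot-and-revert pattern (the `Document` type and the failure flag are hypothetical; only `DeepClone` and `RevertFrom` come from this library):
```csharp
// Hypothetical mutable entity; only DeepClone/RevertFrom are library calls.
var document = new Document { Title = "Draft", Revision = 1 };

// Capture a snapshot before applying risky edits.
var snapshot = document.DeepClone();

document.Title = "Broken edit";
document.Revision = 2;

bool editAccepted = false; // e.g. validation failed

if (!editAccepted) {
  // Copies every field of the snapshot back onto the same instance,
  // so references held by change trackers stay valid.
  document.RevertFrom(snapshot);
}
```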
---
#### DataTable Extensions
@@ -599,24 +645,26 @@ The `Logging` namespace provides a custom file-based logging implementation that
---
### File Logger
The `FileLogger` class in the `MaksIT.Core.Logging` namespace provides a simple and efficient way to log messages to plain text files. It supports log retention policies and ensures thread-safe writes using the `LockManager`.
#### Features
1. **Plain Text Logging**:
- Logs messages in a human-readable plain text format.
2. **Log Retention**:
- Automatically deletes old log files based on a configurable retention period.
3. **Thread Safety**:
- Ensures safe concurrent writes to the log file using the `LockManager`.
---
#### Example Usage
```csharp
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddFileLogger("logs", TimeSpan.FromDays(7)));
var logger = services.BuildServiceProvider().GetRequiredService<ILogger<FileLogger>>();
logger.LogInformation("Logging to file!");
```
@@ -624,6 +672,65 @@ logger.LogInformation("Logging to file!");
---
### JSON File Logger
The `JsonFileLogger` class in the `MaksIT.Core.Logging` namespace provides structured logging in JSON format. It is ideal for machine-readable logs and integrates seamlessly with log aggregation tools.
#### Features
1. **JSON Logging**:
- Logs messages in structured JSON format, including timestamps, log levels, and exceptions.
2. **Log Retention**:
- Automatically deletes old log files based on a configurable retention period.
3. **Thread Safety**:
- Ensures safe concurrent writes to the log file using the `LockManager`.
#### Example Usage
```csharp
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddJsonFileLogger("logs", TimeSpan.FromDays(7)));
var logger = services.BuildServiceProvider().GetRequiredService<ILogger<JsonFileLogger>>();
logger.LogInformation("Logging to JSON file!");
```
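Each entry is written as a single-line JSON object with `Timestamp`, `LogLevel`, `Message`, and `Exception` properties (see `JsonFileLogger` in this commit). A minimal sketch of reading entries back, assuming the default `log_yyyy-MM-dd.json` file name and the `logs` folder used above:
```csharp
// Requires System.Text.Json.
var logFile = Path.Combine("logs", $"log_{DateTime.UtcNow:yyyy-MM-dd}.json");

foreach (var line in File.ReadLines(logFile)) {
  if (string.IsNullOrWhiteSpace(line)) continue;

  var entry = JsonSerializer.Deserialize<JsonElement>(line);
  Console.WriteLine($"{entry.GetProperty("Timestamp").GetString()} " +
                    $"[{entry.GetProperty("LogLevel").GetString()}] " +
                    $"{entry.GetProperty("Message").GetString()}");
}
```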
---
## Threading
### Lock Manager
The `LockManager` class in the `MaksIT.Core.Threading` namespace provides a robust solution for managing concurrency and rate limiting. It ensures safe access to shared resources in multi-threaded or multi-process environments.
#### Features
1. **Thread Safety**:
- Ensures mutual exclusion using a semaphore.
2. **Rate Limiting**:
- Limits the frequency of access to shared resources using a token bucket rate limiter.
3. **Reentrant Locks**:
- Supports reentrant locks for the same thread.
#### Example Usage
```csharp
var lockManager = new LockManager();
await lockManager.ExecuteWithLockAsync(async () => {
// Critical section
Console.WriteLine("Executing safely");
});
lockManager.Dispose();
```
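Calls can also be nested within the same asynchronous flow without deadlocking, since the lock is reentrant (this mirrors the `ShouldAllowReentrantLocks` test added in this commit):
```csharp
var lockManager = new LockManager();

await lockManager.ExecuteWithLockAsync(async () => {
  // Re-entering from the same async flow does not block on the semaphore again.
  await lockManager.ExecuteWithLockAsync(() => {
    Console.WriteLine("Nested critical section");
    return Task.CompletedTask;
  });
});

lockManager.Dispose();
```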
---
## Networking
### Network Connection


@@ -118,5 +118,199 @@ namespace MaksIT.Core.Tests.Extensions {
// Assert
Assert.Equal("{}", result);
}
// ------- DeepClone / DeepEqual / RevertFrom tests below -------
private class Person {
public string Name = "";
public int Age;
private string _secret = "xyz";
public string Secret => _secret;
public void SetSecret(string s) { _secret = s; }
public Address? Addr;
}
private class Address {
public string City = "";
public Person? Owner; // cycle back to person
}
private struct Score {
public int A;
public List<Person>? People; // ref-type field inside struct
}
[Fact]
public void DeepClone_WithSimpleGraph_ShouldProduceIndependentCopy() {
// Arrange
var p = new Person { Name = "Alice", Age = 25, Addr = new Address { City = "Rome" } };
// Act
var clone = p.DeepClone();
// Assert
Assert.NotSame(p, clone);
Assert.Equal("Alice", clone.Name);
Assert.Equal(25, clone.Age);
Assert.NotSame(p.Addr, clone.Addr);
Assert.Equal("Rome", clone.Addr!.City);
// Mutate clone should not affect original
clone.Name = "Bob";
clone.Addr.City = "Milan";
clone.SetSecret("new");
Assert.Equal("Alice", p.Name);
Assert.Equal("Rome", p.Addr!.City);
Assert.Equal("xyz", p.Secret);
}
[Fact]
public void DeepClone_ShouldPreserveCyclesAndReferenceIdentity() {
// Arrange
var p = new Person { Name = "Root" };
var a = new Address { City = "Naples" };
p.Addr = a;
a.Owner = p; // create cycle
// Act
var clone = p.DeepClone();
// Assert
Assert.NotSame(p, clone);
Assert.NotSame(p.Addr, clone.Addr);
Assert.Same(clone, clone.Addr!.Owner); // cycle preserved in clone
}
[Fact]
public void DeepClone_ShouldHandleStructsWithReferenceFields() {
// Arrange
var s = new Score {
A = 7,
People = new List<Person> { new Person { Name = "P1" } }
};
// Act
var sClone = s.DeepClone();
// Assert
Assert.Equal(7, sClone.A);
Assert.NotSame(s.People, sClone.People);
Assert.NotSame(s.People![0], sClone.People![0]);
Assert.Equal("P1", sClone.People[0].Name);
}
[Fact]
public void DeepClone_ShouldHandleArraysAndMultiDimensional() {
// Arrange
var arr = new[] { new Person { Name = "A" }, new Person { Name = "B" } };
var md = (Person[,])Array.CreateInstance(typeof(Person), new[] { 1, 2 }, new[] { 1, 1 });
md[1, 1] = arr[0];
md[1, 2] = arr[1];
// Act
var arrClone = arr.DeepClone();
var mdClone = md.DeepClone();
// Assert
Assert.NotSame(arr, arrClone);
Assert.NotSame(arr[0], arrClone[0]);
Assert.Equal("A", arrClone[0].Name);
Assert.NotSame(md, mdClone);
Assert.Equal(md.GetLowerBound(0), mdClone.GetLowerBound(0));
Assert.Equal(md.GetLowerBound(1), mdClone.GetLowerBound(1));
Assert.NotSame(md[1, 1], mdClone[1, 1]);
Assert.Equal("A", mdClone[1, 1].Name);
}
[Fact]
public void DeepClone_ShouldReturnSameReferenceForImmutable() {
// Arrange
var s = "hello";
// Act
var s2 = s.DeepClone();
// Assert
Assert.Same(s, s2);
}
[Fact]
public void DeepEqual_ShouldReturnTrue_ForEqualGraphs() {
// Arrange
var p1 = new Person { Name = "Alice", Age = 30, Addr = new Address { City = "Turin" } };
var p2 = new Person { Name = "Alice", Age = 30, Addr = new Address { City = "Turin" } };
// Act
var equal = p1.DeepEqual(p2);
// Assert
Assert.True(equal);
}
[Fact]
public void DeepEqual_ShouldReturnFalse_WhenAnyFieldDiffers() {
// Arrange
var p1 = new Person { Name = "Alice", Age = 30, Addr = new Address { City = "Turin" } };
var p2 = new Person { Name = "Alice", Age = 31, Addr = new Address { City = "Turin" } };
// Act
var equal = p1.DeepEqual(p2);
// Assert
Assert.False(equal);
}
[Fact]
public void DeepEqual_ShouldHandleCycles() {
// Arrange
var p1 = new Person { Name = "R", Addr = new Address { City = "X" } };
p1.Addr!.Owner = p1;
var p2 = new Person { Name = "R", Addr = new Address { City = "X" } };
p2.Addr!.Owner = p2;
// Act
var equal = p1.DeepEqual(p2);
// Assert
Assert.True(equal);
}
[Fact]
public void RevertFrom_ShouldCopyStateBackOntoExistingInstance() {
// Arrange
var original = new Person { Name = "Alice", Age = 20, Addr = new Address { City = "Parma" } };
var snapshot = original.DeepClone();
// Mutate original
original.Name = "Changed";
original.Age = 99;
original.Addr!.City = "ChangedCity";
original.SetSecret("changed-secret");
// Act
original.RevertFrom(snapshot);
// Assert
Assert.Equal("Alice", original.Name);
Assert.Equal(20, original.Age);
Assert.Equal("Parma", original.Addr!.City);
Assert.Equal("xyz", original.Secret);
}
[Fact]
public void DeepEqual_NullsAndTypeMismatch_ShouldBehaveCorrectly() {
// Arrange
Person? a = null;
Person? b = null;
var c = new Person();
var d = new TestObject { Name = "x", Age = 1 };
// Act / Assert
Assert.True(a.DeepEqual(b));
Assert.False(a.DeepEqual(c));
// Different runtime types must be false
Assert.False(c.DeepEqual((object)d));
}
}
}


@@ -30,9 +30,8 @@ public static class LoggerHelper
serviceCollection.AddLogging(builder =>
{
builder.ClearProviders();
builder.AddConsoleLogger();
});
var provider = serviceCollection.BuildServiceProvider();


@@ -28,7 +28,7 @@ public class FileLoggerTests {
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<FileLoggerTests>>();
@@ -55,7 +55,7 @@ public class FileLoggerTests {
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, retentionPeriod));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<FileLoggerTests>>();
@@ -86,7 +86,7 @@ public class FileLoggerTests {
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<FileLoggerTests>>();


@@ -0,0 +1,143 @@
using System.Text.Json;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using MaksIT.Core.Logging;
namespace MaksIT.Core.Tests.Logging;
public class JsonFileLoggerTests {
private readonly string _testFolderPath;
public JsonFileLoggerTests() {
_testFolderPath = Path.Combine(Path.GetTempPath(), "JsonFileLoggerTests");
if (Directory.Exists(_testFolderPath)) {
Directory.Delete(_testFolderPath, true);
}
Directory.CreateDirectory(_testFolderPath);
}
[Fact]
public void ShouldWriteLogsInJsonFormat() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddJsonFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<JsonFileLoggerTests>>();
// Act
logger.LogInformation("Test JSON log message");
// Assert
var logFile = Directory.GetFiles(_testFolderPath, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Test JSON log message", logContent);
var logEntry = JsonSerializer.Deserialize<JsonElement>(logContent.TrimEnd(','));
Assert.Equal("Information", logEntry.GetProperty("LogLevel").GetString());
Assert.Equal("Test JSON log message", logEntry.GetProperty("Message").GetString());
}
[Fact]
public void ShouldDeleteOldJsonLogsBasedOnRetention() {
// Arrange
var retentionPeriod = TimeSpan.FromDays(1);
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddJsonFileLogger(_testFolderPath, retentionPeriod));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<JsonFileLoggerTests>>();
// Create an old log file
var oldLogFile = Path.Combine(_testFolderPath, $"log_{DateTime.Now.AddDays(-2):yyyy-MM-dd}.json");
File.WriteAllText(oldLogFile, "{\"Message\":\"Old log\"}");
// Act
logger.LogInformation("New JSON log message");
// Assert
Assert.False(File.Exists(oldLogFile), "Old JSON log file should have been deleted.");
var logFile = Directory.GetFiles(_testFolderPath, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("New JSON log message", logContent);
}
[Fact]
public void ShouldLogExceptionsInJsonFormat() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddJsonFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<JsonFileLoggerTests>>();
// Act
logger.LogError(new InvalidOperationException("Test exception"), "An error occurred");
// Assert
var logFile = Directory.GetFiles(_testFolderPath, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("An error occurred", logContent);
Assert.Contains("Test exception", logContent);
var logEntry = JsonSerializer.Deserialize<JsonElement>(logContent.TrimEnd(','));
Assert.Equal("Error", logEntry.GetProperty("LogLevel").GetString());
Assert.Equal("An error occurred", logEntry.GetProperty("Message").GetString());
Assert.Contains("Test exception", logEntry.GetProperty("Exception").GetString());
}
[Fact]
public void ShouldWorkWithConsoleLogger() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => {
builder.AddJsonFileLogger(_testFolderPath, TimeSpan.FromDays(7));
builder.AddSimpleConsoleLogger();
});
var provider = serviceCollection.BuildServiceProvider();
var logger = provider.GetRequiredService<ILogger<JsonFileLoggerTests>>();
// Act
logger.LogInformation("Test combined logging");
// Assert
var logFile = Directory.GetFiles(_testFolderPath, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Test combined logging", logContent);
}
}


@@ -0,0 +1,105 @@
using System.Diagnostics;
using MaksIT.Core.Threading;
namespace MaksIT.Core.Tests.Threading;
public class LockManagerTests {
[Fact]
public async Task ShouldEnsureThreadSafety() {
// Arrange
var lockManager = new LockManager();
int counter = 0;
// Act
var tasks = Enumerable.Range(0, 10).Select(_ => lockManager.ExecuteWithLockAsync(async () => {
int temp = counter;
await Task.Delay(10); // Simulate work
counter = temp + 1;
}));
await Task.WhenAll(tasks);
// Assert
Assert.Equal(10, counter);
}
[Fact]
public async Task ShouldEnforceRateLimiting() {
// Arrange
var lockManager = new LockManager();
var stopwatch = Stopwatch.StartNew();
// Act
var tasks = Enumerable.Range(0, 10).Select(_ => lockManager.ExecuteWithLockAsync(async () => {
await Task.Delay(10); // Simulate work
}));
await Task.WhenAll(tasks);
stopwatch.Stop();
// With 1 token and 200ms replenishment:
// first task starts immediately, remaining 9 wait ~9 * 200ms = ~1800ms + overhead.
// Allow some jitter on CI.
Assert.InRange(stopwatch.ElapsedMilliseconds, 1700, 6000);
}
[Fact]
public async Task ShouldAllowReentrantLocks() {
// Arrange
var lockManager = new LockManager();
int counter = 0;
// Act
await lockManager.ExecuteWithLockAsync(async () => {
await lockManager.ExecuteWithLockAsync(() => {
counter++;
return Task.CompletedTask;
});
});
// Assert
Assert.Equal(1, counter);
}
[Fact]
public async Task ShouldHandleExceptionsGracefully() {
// Arrange
var lockManager = new LockManager();
int counter = 0;
// Act & Assert
await Assert.ThrowsAsync<InvalidOperationException>(async () => {
await lockManager.ExecuteWithLockAsync(async () => {
counter++;
throw new InvalidOperationException("Test exception");
});
});
// Ensure lock is not in an inconsistent state
await lockManager.ExecuteWithLockAsync(() => {
counter++;
return Task.CompletedTask;
});
Assert.Equal(2, counter);
}
[Fact]
public async Task ShouldSupportConcurrentAccess() {
// Arrange
var lockManager = new LockManager();
int counter = 0;
// Act
var tasks = Enumerable.Range(0, 100).Select(_ => lockManager.ExecuteWithLockAsync(async () => {
int temp = counter;
await Task.Delay(1); // Simulate work
counter = temp + 1;
}));
await Task.WhenAll(tasks);
// Assert
Assert.Equal(100, counter);
}
}


@@ -1,7 +1,11 @@
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.Serialization;
using System.Text.Json;
using System.Text.Json.Serialization;
namespace MaksIT.Core.Extensions;
public static class ObjectExtensions {
@@ -33,4 +37,232 @@ public static class ObjectExtensions {
return JsonSerializer.Serialize(obj, options);
}
/// <summary>
/// Creates a deep clone of the object, preserving reference identity and supporting cycles.
/// </summary>
public static T DeepClone<T>(this T source) {
return (T)DeepCloneInternal(source, new Dictionary<object, object>(ReferenceEqualityComparer.Instance));
}
/// <summary>
/// Deeply compares two objects for structural equality (fields, including private ones).
/// </summary>
public static bool DeepEqual<T>(this T a, T b) {
return DeepEqualInternal(a, b, new HashSet<(object, object)>(ReferencePairComparer.Instance));
}
/// <summary>
/// Copies all fields from the snapshot into the current target object (useful with tracked entities).
/// </summary>
public static void RevertFrom<T>(this T target, T snapshot) {
if (ReferenceEquals(target, snapshot) || target == null || snapshot == null) return;
var visited = new Dictionary<object, object>(ReferenceEqualityComparer.Instance);
CopyAllFields(snapshot!, target!, snapshot!.GetType(), visited);
}
#region Internal Cloner
private static object DeepCloneInternal(object source, Dictionary<object, object> visited) {
if (source == null) return null!;
var type = source.GetType();
// Fast-path for immutable/primitive-ish types
if (IsImmutable(type)) return source;
// Already cloned?
if (!type.IsValueType && visited.TryGetValue(source, out var existing))
return existing;
// Arrays
if (type.IsArray)
return CloneArray((Array)source, visited);
// Value types (structs): shallow copy via boxing + clone ref-type fields
if (type.IsValueType)
return CloneStruct(source, type, visited);
// Reference type: allocate uninitialized object, then copy fields
var clone = FormatterServices.GetUninitializedObject(type);
visited[source] = clone;
CopyAllFields(source, clone, type, visited);
return clone;
}
private static bool IsImmutable(Type t) {
if (t.IsPrimitive || t.IsEnum) return true;
// Common immutable BCL types
if (t == typeof(string) ||
t == typeof(decimal) ||
t == typeof(DateTime) ||
t == typeof(DateTimeOffset) ||
t == typeof(TimeSpan) ||
t == typeof(Guid) ||
t == typeof(Uri))
return true;
// Nullable<T> of immutable underlying
if (Nullable.GetUnderlyingType(t) is Type nt)
return IsImmutable(nt);
return false;
}
private static Array CloneArray(Array source, Dictionary<object, object> visited) {
var elemType = source.GetType().GetElementType()!;
var rank = source.Rank;
var lengths = new int[rank];
var lowers = new int[rank];
for (int d = 0; d < rank; d++) {
lengths[d] = source.GetLength(d);
lowers[d] = source.GetLowerBound(d);
}
var clone = Array.CreateInstance(elemType, lengths, lowers);
visited[source] = clone;
var indices = new int[rank];
CopyArrayRecursive(source, clone, 0, indices, lowers, lengths, visited);
return clone;
}
private static void CopyArrayRecursive(
Array source,
Array target,
int dim,
int[] indices,
int[] lowers,
int[] lengths,
Dictionary<object, object> visited) {
if (dim == source.Rank) {
var value = source.GetValue(indices);
var cloned = DeepCloneInternal(value!, visited);
target.SetValue(cloned, indices);
return;
}
int start = lowers[dim];
int end = lowers[dim] + lengths[dim];
for (int i = start; i < end; i++) {
indices[dim] = i;
CopyArrayRecursive(source, target, dim + 1, indices, lowers, lengths, visited);
}
}
private static object CloneStruct(object source, Type type, Dictionary<object, object> visited) {
// Boxed copy is already a shallow copy of the struct
object boxed = source;
CopyAllFields(boxed, boxed, type, visited, skipVisitedRegistration: true);
return boxed;
}
private static void CopyAllFields(object source, object target, Type type, Dictionary<object, object> visited, bool skipVisitedRegistration = false) {
for (Type t = type; t != null; t = t.BaseType!) {
var fields = t.GetFields(BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.DeclaredOnly);
foreach (var f in fields) {
var value = f.GetValue(source);
var cloned = DeepCloneInternal(value!, visited);
f.SetValue(target, cloned);
}
}
}
#endregion
#region Internal Deep-Equal
private static bool DeepEqualInternal(object a, object b, HashSet<(object, object)> visited) {
if (ReferenceEquals(a, b))
return true;
if (a == null || b == null)
return false;
var type = a.GetType();
if (type != b.GetType())
return false;
// Fast path for immutables
if (IsImmutable(type))
return a.Equals(b);
// Prevent infinite recursion on cycles
var pair = (a, b);
if (visited.Contains(pair))
return true;
visited.Add(pair);
// Arrays
if (type.IsArray)
return ArraysEqual((Array)a, (Array)b, visited);
// Value or reference types: compare fields recursively
return FieldsEqual(a, b, type, visited);
}
private static bool ArraysEqual(Array a, Array b, HashSet<(object, object)> visited) {
if (a.Rank != b.Rank) return false;
for (int d = 0; d < a.Rank; d++) {
if (a.GetLength(d) != b.GetLength(d) || a.GetLowerBound(d) != b.GetLowerBound(d))
return false;
}
var indices = new int[a.Rank];
return CompareArrayRecursive(a, b, 0, indices, visited);
}
private static bool CompareArrayRecursive(Array a, Array b, int dim, int[] indices, HashSet<(object, object)> visited) {
if (dim == a.Rank) {
var va = a.GetValue(indices);
var vb = b.GetValue(indices);
return DeepEqualInternal(va!, vb!, visited);
}
int start = a.GetLowerBound(dim);
int end = start + a.GetLength(dim);
for (int i = start; i < end; i++) {
indices[dim] = i;
if (!CompareArrayRecursive(a, b, dim + 1, indices, visited))
return false;
}
return true;
}
private static bool FieldsEqual(object a, object b, Type type, HashSet<(object, object)> visited) {
for (Type t = type; t != null; t = t.BaseType!) {
var fields = t.GetFields(BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public | BindingFlags.DeclaredOnly);
foreach (var f in fields) {
var va = f.GetValue(a);
var vb = f.GetValue(b);
if (!DeepEqualInternal(va!, vb!, visited))
return false;
}
}
return true;
}
#endregion
#region Helpers
private sealed class ReferenceEqualityComparer : IEqualityComparer<object> {
public static readonly ReferenceEqualityComparer Instance = new ReferenceEqualityComparer();
public new bool Equals(object x, object y) => ReferenceEquals(x, y);
public int GetHashCode(object obj) => RuntimeHelpers.GetHashCode(obj);
}
private sealed class ReferencePairComparer : IEqualityComparer<(object, object)> {
public static readonly ReferencePairComparer Instance = new ReferencePairComparer();
public bool Equals((object, object) x, (object, object) y)
=> ReferenceEquals(x.Item1, y.Item1) && ReferenceEquals(x.Item2, y.Item2);
public int GetHashCode((object, object) obj) {
unchecked {
return (RuntimeHelpers.GetHashCode(obj.Item1) * 397) ^ RuntimeHelpers.GetHashCode(obj.Item2);
}
}
}
#endregion
}


@@ -0,0 +1,57 @@
using System;
using System.IO;
using MaksIT.Core.Threading;
using Microsoft.Extensions.Logging;
namespace MaksIT.Core.Logging;
public abstract class BaseFileLogger : ILogger, IDisposable {
private readonly LockManager _lockManager = new LockManager();
private readonly string _folderPath;
private readonly TimeSpan _retentionPeriod;
protected BaseFileLogger(string folderPath, TimeSpan retentionPeriod) {
_folderPath = folderPath;
_retentionPeriod = retentionPeriod;
Directory.CreateDirectory(_folderPath); // Ensure the folder exists
}
public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;
public bool IsEnabled(LogLevel logLevel) {
return logLevel != LogLevel.None;
}
public abstract void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter);
protected string GenerateLogFileName(string extension) {
return Path.Combine(_folderPath, $"log_{DateTime.UtcNow:yyyy-MM-dd}.{extension}");
}
protected void AppendToLogFile(string logFileName, string content) {
_lockManager.ExecuteWithLockAsync(async () => {
await File.AppendAllTextAsync(logFileName, content);
RemoveExpiredLogFiles(Path.GetExtension(logFileName));
}).Wait();
}
private void RemoveExpiredLogFiles(string extension) {
var filePattern = $"log_*.{extension.TrimStart('.')}";
var logFiles = Directory.GetFiles(_folderPath, filePattern);
var expirationDate = DateTime.UtcNow - _retentionPeriod;
foreach (var logFile in logFiles) {
var fileName = Path.GetFileNameWithoutExtension(logFile);
if (DateTime.TryParseExact(fileName.Substring(4), "yyyy-MM-dd", null, System.Globalization.DateTimeStyles.None, out var logDate)) {
if (logDate < expirationDate) {
File.Delete(logFile);
}
}
}
}
public void Dispose() {
_lockManager.Dispose();
}
}


@@ -1,26 +1,11 @@
using Microsoft.Extensions.Logging;
namespace MaksIT.Core.Logging;
public class FileLogger : BaseFileLogger {
public FileLogger(string folderPath, TimeSpan retentionPeriod) : base(folderPath, retentionPeriod) { }
public override void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter) {
if (!IsEnabled(logLevel))
return;
@@ -28,30 +13,12 @@ public class FileLogger : ILogger {
if (string.IsNullOrEmpty(message))
return;
var logRecord = $"{DateTime.UtcNow.ToString("o")} [{logLevel}] {message}";
if (exception != null) {
logRecord += Environment.NewLine + exception;
}
var logFileName = GenerateLogFileName("txt");
AppendToLogFile(logFileName, logRecord + Environment.NewLine);
}
}


@@ -0,0 +1,23 @@
using Microsoft.Extensions.Logging;
using System.Text.Json;
namespace MaksIT.Core.Logging;
public class JsonFileLogger : BaseFileLogger {
public JsonFileLogger(string folderPath, TimeSpan retentionPeriod) : base(folderPath, retentionPeriod) { }
public override void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter) {
if (!IsEnabled(logLevel))
return;
var logEntry = new {
Timestamp = DateTime.UtcNow.ToString("o"),
LogLevel = logLevel.ToString(),
Message = formatter(state, exception),
Exception = exception?.ToString()
};
var logFileName = GenerateLogFileName("json");
AppendToLogFile(logFileName, JsonSerializer.Serialize(logEntry) + Environment.NewLine);
}
}


@@ -0,0 +1,20 @@
using Microsoft.Extensions.Logging;
namespace MaksIT.Core.Logging;
[ProviderAlias("JsonFileLogger")]
public class JsonFileLoggerProvider : ILoggerProvider {
private readonly string _folderPath;
private readonly TimeSpan _retentionPeriod;
public JsonFileLoggerProvider(string folderPath, TimeSpan? retentionPeriod = null) {
_folderPath = folderPath ?? throw new ArgumentNullException(nameof(folderPath));
_retentionPeriod = retentionPeriod ?? TimeSpan.FromDays(7); // Default retention period is 7 days
}
public ILogger CreateLogger(string categoryName) {
return new JsonFileLogger(_folderPath, _retentionPeriod);
}
public void Dispose() { }
}


@@ -5,25 +5,48 @@ using Microsoft.Extensions.Hosting;
namespace MaksIT.Core.Logging;
public static class LoggingBuilderExtensions {
public static ILoggingBuilder AddFileLogger(this ILoggingBuilder logging, string folderPath, TimeSpan? retentionPeriod = null) {
logging.Services.AddSingleton<ILoggerProvider>(new FileLoggerProvider(folderPath, retentionPeriod));
return logging;
}
public static ILoggingBuilder AddJsonFileLogger(this ILoggingBuilder logging, string folderPath, TimeSpan? retentionPeriod = null) {
logging.Services.AddSingleton<ILoggerProvider>(new JsonFileLoggerProvider(folderPath, retentionPeriod));
return logging;
}
public static ILoggingBuilder AddSimpleConsoleLogger(this ILoggingBuilder logging) {
logging.AddSimpleConsole(options => {
options.IncludeScopes = true;
options.SingleLine = false;
options.TimestampFormat = "yyyy-MM-ddTHH:mm:ss.fffZ";
});
return logging;
}
public static ILoggingBuilder AddJsonConsoleLogger(this ILoggingBuilder logging) {
logging.AddJsonConsole(options => {
options.IncludeScopes = true;
options.TimestampFormat = "yyyy-MM-ddTHH:mm:ss.fffZ";
});
return logging;
}
public static ILoggingBuilder AddConsoleLogger(this ILoggingBuilder logging, string? fileLoggerPath = null) {
logging.ClearProviders();
logging.AddSimpleConsoleLogger();
if (fileLoggerPath != null)
logging.AddFileLogger(fileLoggerPath);
return logging;
}
public static ILoggingBuilder AddJsonConsoleLogger(this ILoggingBuilder logging, string? fileLoggerPath = null) {
logging.ClearProviders();
logging.AddJsonConsoleLogger();
if (fileLoggerPath != null)
logging.AddJsonFileLogger(fileLoggerPath);
return logging;
}
}


@@ -8,7 +8,7 @@
<!-- NuGet package metadata -->
<PackageId>MaksIT.Core</PackageId>
<Version>1.5.1</Version>
<Authors>Maksym Sadovnychyy</Authors>
<Company>MAKS-IT</Company>
<Product>MaksIT.Core</Product>
@@ -35,5 +35,6 @@
<PackageReference Include="Microsoft.IdentityModel.Tokens" Version="8.14.0" />
<PackageReference Include="System.IdentityModel.Tokens.Jwt" Version="8.14.0" />
<PackageReference Include="System.Linq.Dynamic.Core" Version="1.6.9" />
<PackageReference Include="System.Threading.RateLimiting" Version="9.0.10" />
</ItemGroup>
</Project>


@@ -0,0 +1,74 @@
using System.Collections.Concurrent;
using System.Threading.RateLimiting;
namespace MaksIT.Core.Threading;
public class LockManager : IDisposable {
private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
// Use AsyncLocal to track reentrancy in the same async flow
private static readonly AsyncLocal<int> _reentrancyDepth = new AsyncLocal<int>();
// Strict limiter: allow 1 token, replenish 1 every 200ms
private readonly TokenBucketRateLimiter _rateLimiter = new TokenBucketRateLimiter(new TokenBucketRateLimiterOptions {
TokenLimit = 1, // Single concurrent entry
QueueProcessingOrder = QueueProcessingOrder.OldestFirst,
QueueLimit = 1_000,
ReplenishmentPeriod = TimeSpan.FromMilliseconds(200), // 1 token every 200ms
TokensPerPeriod = 1,
AutoReplenishment = true
});
public async Task<T> ExecuteWithLockAsync<T>(Func<Task<T>> action) {
var lease = await _rateLimiter.AcquireAsync(1);
if (!lease.IsAcquired) throw new InvalidOperationException("Rate limit exceeded");
// Determine if this is the first entry for the current async flow
bool isFirstEntry = false;
if (_reentrancyDepth.Value == 0) {
isFirstEntry = true;
_reentrancyDepth.Value = 1;
}
else {
_reentrancyDepth.Value = _reentrancyDepth.Value + 1;
}
if (isFirstEntry) await _semaphore.WaitAsync();
try {
return await action();
}
finally {
// Decrement reentrancy; release semaphore only when depth reaches zero
var newDepth = _reentrancyDepth.Value - 1;
_reentrancyDepth.Value = newDepth < 0 ? 0 : newDepth;
if (isFirstEntry) _semaphore.Release();
// Dispose the lease to complete the rate-limited window
lease.Dispose();
}
}
public async Task ExecuteWithLockAsync(Func<Task> action) {
await ExecuteWithLockAsync(async () => {
await action();
return true;
});
}
public async Task<T> ExecuteWithLockAsync<T>(Func<T> action) {
return await ExecuteWithLockAsync(() => Task.FromResult(action()));
}
public async Task ExecuteWithLockAsync(Action action) {
await ExecuteWithLockAsync(() => {
action();
return Task.CompletedTask;
});
}
public void Dispose() {
_semaphore.Dispose();
_rateLimiter.Dispose();
}
}