(feature): base file logger prefix, bug fixes, improved test coverage

Maksym Sadovnychyy 2026-01-31 18:03:29 +01:00
parent 016904f333
commit 7c8d00bcda
31 changed files with 5340 additions and 131 deletions

42
CHANGELOG.md Normal file

@@ -0,0 +1,42 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## v1.6.1
### Added
- Added `CreateMutex` method to `BaseFileLogger`
- Added `ResolveFolderPath` and `SanitizeForPath` methods to `FileLoggerProvider`
- Added `ResolveFolderPath` and `SanitizeForPath` methods to `JsonFileLoggerProvider`
- Added `LoggerPrefix` class for managing logger prefixes
- AI-assisted CHANGELOG.md generation
### Changed
- Improved error handling in `BaseFileLogger`
<!--
Template for new releases:
## v1.x.x
### Added
- New features
### Changed
- Changes in existing functionality
### Deprecated
- Soon-to-be removed features
### Removed
- Removed features
### Fixed
- Bug fixes
### Security
- Security improvements
-->

168
CONTRIBUTING.md Normal file

@@ -0,0 +1,168 @@
# Contributing to MaksIT.Core
Thank you for your interest in contributing to MaksIT.Core! This document provides guidelines for contributing to the project.
## Getting Started
1. Fork the repository
2. Clone your fork locally
3. Create a new branch for your changes
4. Make your changes
5. Submit a pull request
## Development Setup
### Prerequisites
- .NET 10 SDK or later
- Git
### Building the Project
```bash
cd src
dotnet build MaksIT.Core.sln
```
### Running Tests
```bash
cd src
dotnet test MaksIT.Core.Tests
```
## Commit Message Format
This project uses the following commit message format:
```
(type): description
```
### Commit Types
| Type | Description |
|------|-------------|
| `(feature):` | New feature or enhancement |
| `(bugfix):` | Bug fix |
| `(refactor):` | Code refactoring without functional changes |
| `(chore):` | Maintenance tasks (dependencies, CI, documentation) |
### Examples
```
(feature): add support for custom JWT claims
(bugfix): fix multithreading issue in file logger
(refactor): simplify expression extension methods
(chore): update copyright year to 2026
```
### Guidelines
- Use lowercase for the description
- Keep the description concise but descriptive
- No period at the end of the description
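These conventions can be checked mechanically. A small sketch, assuming the four types above (the `check_commit_msg` helper and its regex are illustrative, not part of the project's tooling):

```shell
# Hypothetical helper: accept "(type): description" with a known type,
# a lowercase first character in the description, and no trailing period.
check_commit_msg() {
  echo "$1" | grep -Eq '^\((feature|bugfix|refactor|chore)\): [a-z].*[^.]$' \
    && echo "valid" || echo "invalid"
}

check_commit_msg "(feature): add support for custom JWT claims"  # prints "valid"
check_commit_msg "(bugfix): Fix multithreading issue."           # prints "invalid"
```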
## Code Style
- Follow standard C# naming conventions
- Use XML documentation comments for public APIs
- Keep methods focused and single-purpose
- Write unit tests for new functionality
## Pull Request Process
1. Ensure all tests pass
2. Update documentation if needed
3. Update CHANGELOG.md with your changes under the appropriate version section
4. Submit your pull request against the `main` branch
## Versioning
This project follows [Semantic Versioning](https://semver.org/):
- **MAJOR** - Breaking changes
- **MINOR** - New features (backward compatible)
- **PATCH** - Bug fixes (backward compatible)
## Release Process
The release process is automated via PowerShell scripts in the `src/` directory.
### Prerequisites
- Docker Desktop running (for Linux tests)
- GitHub CLI (`gh`) authenticated
- NuGet API key in `NUGET_API_KEY` environment variable
- GitHub token in `GITHUB_MAKS_IT_COM` environment variable
### Release Workflow
1. **Update version** in `MaksIT.Core/MaksIT.Core.csproj`
2. **Generate changelog** (uses AI with Ollama if available):
```powershell
cd src
.\Generate-Changelog.ps1 # Updates CHANGELOG.md and LICENSE.md year
.\Generate-Changelog.ps1 -DryRun # Preview without changes
```
3. **Review and commit** all changes:
```bash
git add -A
git commit -m "(chore): release v1.x.x"
```
4. **Create version tag**:
```bash
git tag v1.x.x
```
5. **Run release script**:
```powershell
cd src
.\Release-NuGetPackage.ps1 # Full release
.\Release-NuGetPackage.ps1 -DryRun # Test without publishing
```
### How Release Works
The release script:
1. **Reads latest version** from `CHANGELOG.md`
2. **Finds the commit** with the matching version tag (e.g., `v1.2.3`)
3. **Checks if already released** on NuGet.org and skips if so
4. **Builds and tests** the tagged commit
5. **Publishes** to NuGet and GitHub
You can run the release script from any branch or commit - it will always release the commit that has the version tag matching the latest changelog entry.
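The tag lookup itself is plain git. A minimal sketch in a throwaway repository (the tag name and commit message are illustrative):

```shell
# Create a throwaway repo with one tagged release commit
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "(chore): release v1.2.3"
git tag v1.2.3

# Resolve the commit the version tag points to: this is the commit
# that gets built and published, whatever branch is checked out
git rev-list -n 1 v1.2.3
```

Because the lookup goes through the tag, the currently checked-out branch does not affect which commit is released.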
### Release Script Validation
- **Version source**: Reads latest version from `CHANGELOG.md`
- **Tag required**: Must have a tag matching the changelog version
- **Branch validation**: Tag must be on configured branch (default: `main`, set in `scriptsettings.json`)
- **Already released**: Skips if version exists on NuGet.org
- **Clean working directory**: No uncommitted changes allowed
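For reference, the branch that the tag must sit on comes from `scriptsettings.json`. A minimal sketch of such a file, assuming a single key (the key name is a guess; only the `main` default is stated above):

```json
{
  "releaseBranch": "main"
}
```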
### What the Release Script Does
1. Validates prerequisites and environment
2. Runs security vulnerability scan
3. Builds and tests on Windows
4. Builds and tests on Linux (via Docker)
5. Analyzes code coverage
6. Creates NuGet package
7. Pushes to NuGet.org
8. Creates GitHub release with assets
### Re-releasing
To re-release the same version (e.g., to fix release assets):
- Keep the same tag on the same commit
- Run the release script again
- It will delete the existing GitHub release and recreate it
## License
By contributing, you agree that your contributions will be licensed under the MIT License.


@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2024 - 2025 Maksym Sadovnychyy (MAKS-IT)
Copyright (c) 2024 - 2026 Maksym Sadovnychyy (MAKS-IT)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

296
README.md

@@ -1,4 +1,4 @@
# MaksIT.Core Library Documentation
# MaksIT.Core Library Documentation
## Table of Contents
@@ -16,6 +16,7 @@
- [Logging](#logging)
- [File Logger](#file-logger)
- [JSON File Logger](#json-file-logger)
- [Logger Prefix](#logger-prefix)
- [Threading](#threading)
- [Lock Manager](#lock-manager)
- [Networking](#networking)
@@ -31,7 +32,10 @@
- [JWK Thumbprint Utility](#jwk-thumbprint-utility)
- [JWS Generator](#jws-generator)
- [TOTP Generator](#totp-generator)
- [Web API Models](#web-api-models)
- [Web API](#web-api)
- [Paged Request](#paged-request)
- [Paged Response](#paged-response)
- [Patch Operation](#patch-operation)
- [Sagas](#sagas)
- [CombGuidGenerator](#combguidgenerator)
- [Others](#others)
@@ -664,6 +668,9 @@ The `FileLogger` class in the `MaksIT.Core.Logging` namespace provides a simple
3. **Thread Safety**:
- Ensures safe concurrent writes to the log file using the `LockManager`.
4. **Folder-Based Logging**:
- Organize logs into subfolders using the `LoggerPrefix` feature.
#### Example Usage
```csharp
@@ -691,6 +698,9 @@ The `JsonFileLogger` class in the `MaksIT.Core.Logging` namespace provides struc
3. **Thread Safety**:
- Ensures safe concurrent writes to the log file using the `LockManager`.
4. **Folder-Based Logging**:
- Organize logs into subfolders using the `LoggerPrefix` feature.
#### Example Usage
```csharp
@@ -703,6 +713,92 @@ logger.LogInformation("Logging to JSON file!");
---
### Logger Prefix
The `LoggerPrefix` class in the `MaksIT.Core.Logging` namespace provides a type-safe way to specify logger categories with special prefixes. It extends the `Enumeration` base class and enables organizing logs into subfolders or applying custom categorization without using magic strings.
#### Features
1. **Type-Safe Prefixes**:
- Avoid magic strings by using strongly-typed prefix constants.
2. **Folder-Based Organization**:
- Use `LoggerPrefix.Folder` to write logs to specific subfolders.
3. **Extensible Categories**:
- Additional prefixes like `LoggerPrefix.Category` and `LoggerPrefix.Tag` are available for future use.
4. **Automatic Parsing**:
- Parse category names to extract prefix and value using `LoggerPrefix.Parse()`.
5. **Backward Compatible**:
- Standard `ILogger<T>` usage remains unchanged; prefixes are only applied when explicitly used.
#### Available Prefixes
| Prefix | Purpose |
|--------|---------|
| `LoggerPrefix.Folder` | Writes logs to a subfolder with the specified name |
| `LoggerPrefix.Category` | Reserved for categorization (future use) |
| `LoggerPrefix.Tag` | Reserved for tagging (future use) |
#### Example Usage
##### Creating a Logger with a Folder Prefix
```csharp
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddFileLogger("logs", TimeSpan.FromDays(7)));
var provider = services.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Create a logger that writes to logs/Audit/log_yyyy-MM-dd.txt
var auditLogger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue("Audit"));
auditLogger.LogInformation("Audit event occurred");
// Create a logger that writes to logs/Orders/log_yyyy-MM-dd.txt
var ordersLogger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue("Orders"));
ordersLogger.LogInformation("Order processed");
```
##### Standard `ILogger<T>` Usage (Unchanged)
```csharp
// Standard usage - logs go to the default folder (logs/log_yyyy-MM-dd.txt)
var logger = provider.GetRequiredService<ILogger<MyService>>();
logger.LogInformation("Standard log message");
```
##### Parsing a Category Name
```csharp
var categoryName = "Folder:Audit";
var (prefix, value) = LoggerPrefix.Parse(categoryName);
if (prefix == LoggerPrefix.Folder) {
Console.WriteLine($"Folder: {value}"); // Output: Folder: Audit
}
```
#### Result
| Logger Creation | Log File Location |
|-----------------|-------------------|
| `ILogger<MyService>` | `logs/log_2026-01-30.txt` |
| `CreateLogger(LoggerPrefix.Folder.WithValue("Audit"))` | `logs/Audit/log_2026-01-30.txt` |
| `CreateLogger(LoggerPrefix.Folder.WithValue("Orders"))` | `logs/Orders/log_2026-01-30.txt` |
#### Best Practices
1. **Use Type-Safe Prefixes**:
- Always use `LoggerPrefix.Folder.WithValue()` instead of raw strings like `"Folder:Audit"`.
2. **Organize by Domain**:
- Use meaningful folder names to organize logs by domain (e.g., "Audit", "Orders", "Security").
3. **Keep Default Logging Simple**:
- Use standard `ILogger<T>` for general application logging and folder prefixes for specialized logs.
---
## Threading
### Lock Manager
@@ -1123,7 +1219,7 @@ using System.Security.Cryptography;
using MaksIT.Core.Security.JWK;
using var rsa = RSA.Create(2048);
JwkGenerator.TryGenerateFromRCA(rsa, out var jwk, out var errorMessage);
JwkGenerator.TryGenerateFromRSA(rsa, out var jwk, out var errorMessage);
var result = JwkThumbprintUtility.TryGetSha256Thumbprint(jwk!, out var thumbprint, out var error);
if (result)
{
@@ -1147,7 +1243,6 @@ else
{
Console.WriteLine($"Error: {error}");
}
}
```
---
@@ -1182,6 +1277,199 @@ public static bool TryGetKeyAuthorization(
---
### TOTP Generator
The `TotpGenerator` class in the `MaksIT.Core.Security` namespace provides methods for generating and validating Time-based One-Time Passwords (TOTP) for two-factor authentication.
---
#### Features
1. **TOTP Validation**:
- Validate TOTP codes against a shared secret with configurable time tolerance.
2. **TOTP Generation**:
- Generate TOTP codes from a Base32-encoded secret.
3. **Secret Generation**:
- Generate cryptographically secure Base32 secrets for TOTP setup.
4. **Recovery Codes**:
- Generate backup recovery codes for account recovery.
5. **Auth Link Generation**:
- Generate `otpauth://` URIs for QR code scanning in authenticator apps.
---
#### Example Usage
##### Generating a Secret
```csharp
TotpGenerator.TryGenerateSecret(out var secret, out var error);
// secret is a Base32-encoded string for use with authenticator apps
```
##### Validating a TOTP Code
```csharp
var timeTolerance = 1; // Allow 1 time step before/after current
TotpGenerator.TryValidate(totpCode, secret, timeTolerance, out var isValid, out var error);
if (isValid) {
Console.WriteLine("TOTP is valid");
}
```
##### Generating Recovery Codes
```csharp
TotpGenerator.TryGenerateRecoveryCodes(10, out var recoveryCodes, out var error);
// recoveryCodes contains 10 codes in format "XXXX-XXXX"
```
##### Generating an Auth Link for QR Code
```csharp
TotpGenerator.TryGenerateTotpAuthLink(
"MyApp",
"user@example.com",
secret,
"MyApp",
null, // algorithm (default SHA1)
null, // digits (default 6)
null, // period (default 30)
out var authLink,
out var error
);
// authLink = "otpauth://totp/MyApp:user@example.com?secret=...&issuer=MyApp"
```
---
## Web API
The `Webapi` namespace provides models and utilities for building Web APIs, including pagination support and patch operations.
---
### Paged Request
The `PagedRequest` class in the `MaksIT.Core.Webapi.Models` namespace provides a base class for paginated API requests with filtering and sorting capabilities.
#### Features
1. **Pagination**:
- Configure page size and page number for paginated results.
2. **Dynamic Filtering**:
- Build filter expressions from string-based filter queries.
3. **Dynamic Sorting**:
- Build sort expressions with ascending/descending order.
#### Properties
| Property | Type | Default | Description |
|----------|------|---------|-------------|
| `PageSize` | `int` | `100` | Number of items per page |
| `PageNumber` | `int` | `1` | Current page number |
| `Filters` | `string?` | `null` | Filter expression string |
| `SortBy` | `string?` | `null` | Property name to sort by |
| `IsAscending` | `bool` | `true` | Sort direction |
#### Example Usage
```csharp
var request = new PagedRequest {
PageSize = 20,
PageNumber = 1,
Filters = "Name.Contains(\"John\") && Age > 18",
SortBy = "Name",
IsAscending = true
};
var filterExpression = request.BuildFilterExpression<User>();
var sortExpression = request.BuildSortExpression<User>();
var results = dbContext.Users
.Where(filterExpression)
.OrderBy(sortExpression)
.Skip((request.PageNumber - 1) * request.PageSize)
.Take(request.PageSize)
.ToList();
```
---
### Paged Response
The `PagedResponse<T>` class in the `MaksIT.Core.Webapi.Models` namespace provides a generic wrapper for paginated API responses.
#### Properties
| Property | Type | Description |
|----------|------|-------------|
| `Items` | `IEnumerable<T>` | The items for the current page |
| `PageNumber` | `int` | Current page number |
| `PageSize` | `int` | Number of items per page |
| `TotalCount` | `int` | Total number of items across all pages |
| `TotalPages` | `int` | Calculated total number of pages |
| `HasPreviousPage` | `bool` | Whether a previous page exists |
| `HasNextPage` | `bool` | Whether a next page exists |
#### Example Usage
```csharp
var items = await dbContext.Users
.Skip((pageNumber - 1) * pageSize)
.Take(pageSize)
.ToListAsync();
var totalCount = await dbContext.Users.CountAsync();
var response = new PagedResponse<User>(items, totalCount, pageNumber, pageSize);
// response.TotalPages, response.HasNextPage, etc. are automatically calculated
```
---
### Patch Operation
The `PatchOperation` enum in the `MaksIT.Core.Webapi.Models` namespace defines operations for partial updates (PATCH requests).
#### Values
| Value | Description |
|-------|-------------|
| `SetField` | Set or replace a normal field value |
| `RemoveField` | Set a field to null |
| `AddToCollection` | Add an item to a collection property |
| `RemoveFromCollection` | Remove an item from a collection property |
#### Example Usage
```csharp
public class UserPatchRequest : PatchRequestModelBase {
public PatchOperation Operation { get; set; }
public string PropertyName { get; set; }
public object? Value { get; set; }
}
// Example: Set a field
var patch = new UserPatchRequest {
Operation = PatchOperation.SetField,
PropertyName = "Name",
Value = "New Name"
};
// Example: Add to collection
patch = new UserPatchRequest {
Operation = PatchOperation.AddToCollection,
PropertyName = "Roles",
Value = "Admin"
};
```
---
## Others
### Culture


@@ -0,0 +1,107 @@
namespace MaksIT.Core.Tests;
using System.Globalization;
public class CultureTests {
[Fact]
public void TrySet_NullCulture_SetsInvariantCulture() {
// Arrange
string? culture = null;
// Act
var result = Culture.TrySet(culture, out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Equal(CultureInfo.InvariantCulture, Thread.CurrentThread.CurrentCulture);
Assert.Equal(CultureInfo.InvariantCulture, Thread.CurrentThread.CurrentUICulture);
}
[Fact]
public void TrySet_EmptyCulture_SetsInvariantCulture() {
// Arrange
string culture = "";
// Act
var result = Culture.TrySet(culture, out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Equal(CultureInfo.InvariantCulture, Thread.CurrentThread.CurrentCulture);
Assert.Equal(CultureInfo.InvariantCulture, Thread.CurrentThread.CurrentUICulture);
}
[Theory]
[InlineData("en-US")]
[InlineData("en-GB")]
[InlineData("de-DE")]
[InlineData("fr-FR")]
[InlineData("ja-JP")]
public void TrySet_ValidCulture_SetsCulture(string cultureName) {
// Act
var result = Culture.TrySet(cultureName, out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Equal(cultureName, Thread.CurrentThread.CurrentCulture.Name);
Assert.Equal(cultureName, Thread.CurrentThread.CurrentUICulture.Name);
}
[Fact]
public void TrySet_InvalidCulture_ReturnsFalseWithErrorMessage() {
// Arrange - use a culture name that's invalid on all platforms
// Note: Linux is more permissive with culture names than Windows
// Using a very malformed name that should fail everywhere
string culture = "xx-INVALID-12345-@#$%";
// Act
var result = Culture.TrySet(culture, out var errorMessage);
// Assert
// On some Linux systems, even invalid cultures may not throw
// So we just verify the method handles it without crashing
if (!result) {
Assert.NotNull(errorMessage);
Assert.NotEmpty(errorMessage);
}
// If it somehow succeeds (very permissive system), that's also acceptable
}
[Fact]
public void TrySet_ValidCulture_AffectsCurrentThread() {
// Arrange
var originalCulture = Thread.CurrentThread.CurrentCulture;
try {
// Act
Culture.TrySet("de-DE", out _);
// Assert
Assert.Equal("de-DE", Thread.CurrentThread.CurrentCulture.Name);
}
finally {
// Cleanup - restore original culture
Thread.CurrentThread.CurrentCulture = originalCulture;
Thread.CurrentThread.CurrentUICulture = originalCulture;
}
}
[Fact]
public void TrySet_NeutralCulture_CreatesSpecificCulture() {
// Arrange - "en" is a neutral culture, should create specific culture
string culture = "en";
// Act
var result = Culture.TrySet(culture, out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
// CreateSpecificCulture("en") typically returns "en-US" or similar
Assert.StartsWith("en", Thread.CurrentThread.CurrentCulture.Name);
}
}


@@ -0,0 +1,178 @@
namespace MaksIT.Core.Tests;
public class EnvVarTests {
private const string TestEnvVarName = "MAKSIT_TEST_ENV_VAR";
private const string TestEnvVarValue = "test_value_123";
[Fact]
public void TrySet_ProcessLevel_SetsEnvironmentVariable() {
// Arrange & Act
var result = EnvVar.TrySet(TestEnvVarName, TestEnvVarValue, "process", out var errorMessage);
try {
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Equal(TestEnvVarValue, Environment.GetEnvironmentVariable(TestEnvVarName));
}
finally {
// Cleanup
Environment.SetEnvironmentVariable(TestEnvVarName, null);
}
}
[Fact]
public void TryUnSet_ProcessLevel_RemovesEnvironmentVariable() {
// Arrange
Environment.SetEnvironmentVariable(TestEnvVarName, TestEnvVarValue);
// Act
var result = EnvVar.TryUnSet(TestEnvVarName, "process", out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Null(Environment.GetEnvironmentVariable(TestEnvVarName));
}
[Fact]
public void TrySet_UserLevel_SetsEnvironmentVariable() {
// This test may fail on Linux/Docker containers due to permissions
// Skip on non-Windows platforms as User-level env vars behave differently
if (!OperatingSystem.IsWindows()) {
// On Linux, user-level env vars in containers don't persist as expected
// Just verify the method doesn't crash
var result = EnvVar.TrySet(TestEnvVarName, TestEnvVarValue, "user", out var errorMessage);
// Either succeeds or fails gracefully - both are acceptable on Linux
Assert.True(result || errorMessage != null);
return;
}
// Windows-specific test
var winResult = EnvVar.TrySet(TestEnvVarName, TestEnvVarValue, "user", out var winErrorMessage);
try {
if (winResult) {
Assert.Null(winErrorMessage);
var value = Environment.GetEnvironmentVariable(TestEnvVarName, EnvironmentVariableTarget.User);
Assert.Equal(TestEnvVarValue, value);
}
}
finally {
try {
Environment.SetEnvironmentVariable(TestEnvVarName, null, EnvironmentVariableTarget.User);
}
catch {
// Ignore cleanup errors
}
}
}
[Fact]
public void TryAddToPath_AddsPathToEnvironment() {
// Arrange
var originalPath = Environment.GetEnvironmentVariable("PATH");
var newPath = "/test/path/that/does/not/exist";
try {
// Act
var result = EnvVar.TryAddToPath(newPath, out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
var currentPath = Environment.GetEnvironmentVariable("PATH");
Assert.Contains(newPath, currentPath);
}
finally {
// Cleanup - restore original PATH
Environment.SetEnvironmentVariable("PATH", originalPath);
}
}
[Fact]
public void TryAddToPath_DuplicatePath_DoesNotAddAgain() {
// Arrange
var originalPath = Environment.GetEnvironmentVariable("PATH");
var newPath = "/test/unique/path";
try {
// Add first time
EnvVar.TryAddToPath(newPath, out _);
var pathAfterFirstAdd = Environment.GetEnvironmentVariable("PATH");
// Act - Add same path again
var result = EnvVar.TryAddToPath(newPath, out var errorMessage);
var pathAfterSecondAdd = Environment.GetEnvironmentVariable("PATH");
// Assert
Assert.True(result);
Assert.Null(errorMessage);
// Path should not have duplicate entries
Assert.Equal(pathAfterFirstAdd, pathAfterSecondAdd);
}
finally {
// Cleanup
Environment.SetEnvironmentVariable("PATH", originalPath);
}
}
[Theory]
[InlineData("process")]
[InlineData("user")]
[InlineData("Process")]
[InlineData("USER")]
public void TrySet_VariousTargets_HandlesCorrectly(string target) {
// Arrange
var envName = $"{TestEnvVarName}_{target.ToUpper()}";
// Act
var result = EnvVar.TrySet(envName, TestEnvVarValue, target, out var errorMessage);
// Assert - for process level, should always succeed
if (target.ToLower() == "process") {
Assert.True(result);
Assert.Null(errorMessage);
}
// For other levels, result depends on permissions
// Cleanup
try {
EnvVar.TryUnSet(envName, target, out _);
}
catch {
// Ignore cleanup errors
}
}
[Fact]
public void TrySet_EmptyValue_SetsEmptyString() {
// Arrange & Act
var result = EnvVar.TrySet(TestEnvVarName, "", "process", out var errorMessage);
try {
// Assert
Assert.True(result);
Assert.Null(errorMessage);
Assert.Equal("", Environment.GetEnvironmentVariable(TestEnvVarName));
}
finally {
// Cleanup
Environment.SetEnvironmentVariable(TestEnvVarName, null);
}
}
[Fact]
public void TryUnSet_NonExistentVariable_Succeeds() {
// Arrange
var nonExistentVar = "MAKSIT_NON_EXISTENT_VAR_12345";
// Act
var result = EnvVar.TryUnSet(nonExistentVar, "process", out var errorMessage);
// Assert
Assert.True(result);
Assert.Null(errorMessage);
}
}


@@ -0,0 +1,80 @@
namespace MaksIT.Core.Tests.Extensions;
using MaksIT.Core.Extensions;
public class ExceptionExtensionsTests {
[Fact]
public void ExtractMessages_SingleException_ReturnsSingleMessage() {
// Arrange
var exception = new InvalidOperationException("Test message");
// Act
var messages = exception.ExtractMessages();
// Assert
Assert.Single(messages);
Assert.Equal("Test message", messages[0]);
}
[Fact]
public void ExtractMessages_WithInnerException_ReturnsAllMessages() {
// Arrange
var innerException = new ArgumentException("Inner message");
var outerException = new InvalidOperationException("Outer message", innerException);
// Act
var messages = outerException.ExtractMessages();
// Assert
Assert.Equal(2, messages.Count);
Assert.Equal("Outer message", messages[0]);
Assert.Equal("Inner message", messages[1]);
}
[Fact]
public void ExtractMessages_WithMultipleNestedExceptions_ReturnsAllMessages() {
// Arrange
var innermost = new ArgumentNullException("param", "Innermost message");
var middle = new ArgumentException("Middle message", innermost);
var outer = new InvalidOperationException("Outer message", middle);
// Act
var messages = outer.ExtractMessages();
// Assert
Assert.Equal(3, messages.Count);
Assert.Equal("Outer message", messages[0]);
Assert.Equal("Middle message", messages[1]);
Assert.Contains("Innermost message", messages[2]);
}
[Fact]
public void ExtractMessages_AggregateException_ReturnsOuterMessage() {
// Arrange
var inner1 = new InvalidOperationException("Error 1");
var inner2 = new ArgumentException("Error 2");
var aggregate = new AggregateException("Multiple errors", inner1, inner2);
// Act
var messages = aggregate.ExtractMessages();
// Assert
// AggregateException's InnerException is the first inner exception
Assert.Equal(2, messages.Count);
Assert.Contains("Multiple errors", messages[0]);
}
[Fact]
public void ExtractMessages_EmptyMessage_ReturnsEmptyString() {
// Arrange
var exception = new Exception("");
// Act
var messages = exception.ExtractMessages();
// Assert
Assert.Single(messages);
Assert.Equal("", messages[0]);
}
}


@@ -0,0 +1,202 @@
namespace MaksIT.Core.Tests.Extensions;
using MaksIT.Core.Extensions;
public class FormatsExtensionsTests : IDisposable {
private readonly string _testDirectory;
private readonly List<string> _createdFiles = new();
public FormatsExtensionsTests() {
_testDirectory = Path.Combine(Path.GetTempPath(), $"MaksIT_Test_{Guid.NewGuid()}");
Directory.CreateDirectory(_testDirectory);
}
public void Dispose() {
// Cleanup
try {
if (Directory.Exists(_testDirectory)) {
Directory.Delete(_testDirectory, true);
}
foreach (var file in _createdFiles) {
if (File.Exists(file)) {
File.Delete(file);
}
}
}
catch {
// Ignore cleanup errors
}
}
[Fact]
public void TryCreateTarFromDirectory_ValidDirectory_ReturnsTrue() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "source");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "test.txt"), "Hello, World!");
var outputTar = Path.Combine(_testDirectory, "output.tar");
_createdFiles.Add(outputTar);
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.True(result);
Assert.True(File.Exists(outputTar));
Assert.True(new FileInfo(outputTar).Length > 0);
}
[Fact]
public void TryCreateTarFromDirectory_MultipleFiles_ReturnsTrue() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "multi_source");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "file1.txt"), "Content 1");
File.WriteAllText(Path.Combine(sourceDir, "file2.txt"), "Content 2");
File.WriteAllText(Path.Combine(sourceDir, "file3.txt"), "Content 3");
var outputTar = Path.Combine(_testDirectory, "multi_output.tar");
_createdFiles.Add(outputTar);
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.True(result);
Assert.True(File.Exists(outputTar));
}
[Fact]
public void TryCreateTarFromDirectory_NestedDirectories_ReturnsTrue() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "nested_source");
var subDir = Path.Combine(sourceDir, "subdir");
Directory.CreateDirectory(subDir);
File.WriteAllText(Path.Combine(sourceDir, "root.txt"), "Root content");
File.WriteAllText(Path.Combine(subDir, "nested.txt"), "Nested content");
var outputTar = Path.Combine(_testDirectory, "nested_output.tar");
_createdFiles.Add(outputTar);
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.True(result);
Assert.True(File.Exists(outputTar));
}
[Fact]
public void TryCreateTarFromDirectory_EmptyDirectory_ReturnsFalse() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "empty_source");
Directory.CreateDirectory(sourceDir);
var outputTar = Path.Combine(_testDirectory, "empty_output.tar");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.False(result);
Assert.False(File.Exists(outputTar));
}
[Fact]
public void TryCreateTarFromDirectory_NonExistentDirectory_ReturnsFalse() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "non_existent");
var outputTar = Path.Combine(_testDirectory, "non_existent_output.tar");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_NullSourceDirectory_ReturnsFalse() {
// Arrange
var outputTar = Path.Combine(_testDirectory, "null_source_output.tar");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(null!, outputTar);
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_EmptySourceDirectory_ReturnsFalse() {
// Arrange
var outputTar = Path.Combine(_testDirectory, "empty_path_output.tar");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory("", outputTar);
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_WhitespaceSourceDirectory_ReturnsFalse() {
// Arrange
var outputTar = Path.Combine(_testDirectory, "whitespace_output.tar");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(" ", outputTar);
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_NullOutputPath_ReturnsFalse() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "valid_source");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "test.txt"), "Content");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, null!);
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_EmptyOutputPath_ReturnsFalse() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "valid_source2");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "test.txt"), "Content");
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, "");
// Assert
Assert.False(result);
}
[Fact]
public void TryCreateTarFromDirectory_CreatesOutputDirectory_WhenNotExists() {
// Arrange
var sourceDir = Path.Combine(_testDirectory, "source_for_new_dir");
Directory.CreateDirectory(sourceDir);
File.WriteAllText(Path.Combine(sourceDir, "test.txt"), "Content");
var outputDir = Path.Combine(_testDirectory, "new_output_dir");
var outputTar = Path.Combine(outputDir, "output.tar");
_createdFiles.Add(outputTar);
// Act
var result = FormatsExtensions.TryCreateTarFromDirectory(sourceDir, outputTar);
// Assert
Assert.True(result);
Assert.True(Directory.Exists(outputDir));
Assert.True(File.Exists(outputTar));
}
}


@@ -1,4 +1,4 @@
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using MaksIT.Core.Logging;
@@ -103,4 +103,118 @@ public class FileLoggerTests {
Assert.Fail("Logger should handle exceptions gracefully.");
}
}
[Fact]
public void ShouldWriteLogsToSubfolderWhenFolderPrefixUsed() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act - Create logger with Folder prefix
var logger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue("Audit"));
logger.LogInformation("Audit log message");
// Assert
var auditFolder = Path.Combine(_testFolderPath, "Audit");
Assert.True(Directory.Exists(auditFolder), "Audit subfolder should be created");
var logFile = Directory.GetFiles(auditFolder, "log_*.txt").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Audit log message", logContent);
}
[Fact]
public void ShouldWriteLogsToDefaultFolderWhenNoPrefixUsed() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act - Create logger with full type name (simulating ILogger<T>)
var logger = loggerFactory.CreateLogger("MyApp.Services.OrderService");
logger.LogInformation("Order service log message");
// Assert - Should NOT create subfolder for type names
var logFile = Directory.GetFiles(_testFolderPath, "log_*.txt").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Order service log message", logContent);
}
[Fact]
public void ShouldHandleFolderPrefixWithSpaces() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act
var logger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue("My Custom Logs"));
logger.LogInformation("Custom folder log message");
// Assert
var customFolder = Path.Combine(_testFolderPath, "My Custom Logs");
Assert.True(Directory.Exists(customFolder), "Custom subfolder with spaces should be created");
var logFile = Directory.GetFiles(customFolder, "log_*.txt").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Custom folder log message", logContent);
}
[Fact]
public void ShouldIgnoreEmptyFolderPrefix() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act - Create logger with empty folder value
var logger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue(""));
logger.LogInformation("Empty folder prefix log message");
// Assert - Should use default folder (not create empty subfolder)
var logFile = Directory.GetFiles(_testFolderPath, "log_*.txt").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Empty folder prefix log message", logContent);
}
}

View File

@@ -140,4 +140,61 @@ public class JsonFileLoggerTests {
var logContent = File.ReadAllText(logFile);
Assert.Contains("Test combined logging", logContent);
}
[Fact]
public void ShouldWriteLogsToSubfolderWhenFolderPrefixUsed() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddJsonFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act - Create logger with Folder prefix
var logger = loggerFactory.CreateLogger(LoggerPrefix.Folder.WithValue("Audit"));
logger.LogInformation("Audit JSON log message");
// Assert
var auditFolder = Path.Combine(_testFolderPath, "Audit");
Assert.True(Directory.Exists(auditFolder), "Audit subfolder should be created");
var logFile = Directory.GetFiles(auditFolder, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Audit JSON log message", logContent);
}
[Fact]
public void ShouldWriteLogsToDefaultFolderWhenNoPrefixUsed() {
// Arrange
var serviceCollection = new ServiceCollection();
serviceCollection.AddSingleton<IHostEnvironment>(sp =>
new TestHostEnvironment {
EnvironmentName = Environments.Development,
ApplicationName = "TestApp",
ContentRootPath = Directory.GetCurrentDirectory()
});
serviceCollection.AddLogging(builder => builder.AddJsonFileLogger(_testFolderPath, TimeSpan.FromDays(7)));
var provider = serviceCollection.BuildServiceProvider();
var loggerFactory = provider.GetRequiredService<ILoggerFactory>();
// Act - Create logger with full type name (simulating ILogger<T>)
var logger = loggerFactory.CreateLogger("MyApp.Services.OrderService");
logger.LogInformation("Order service JSON log message");
// Assert - Should NOT create subfolder for type names
var logFile = Directory.GetFiles(_testFolderPath, "log_*.json").FirstOrDefault();
Assert.NotNull(logFile);
var logContent = File.ReadAllText(logFile);
Assert.Contains("Order service JSON log message", logContent);
}
}

View File

@@ -0,0 +1,160 @@
using MaksIT.Core.Logging;
namespace MaksIT.Core.Tests.Logging;
public class LoggerPrefixTests {
[Fact]
public void WithValue_ShouldCreateCorrectCategoryString() {
// Arrange & Act
var folderCategory = LoggerPrefix.Folder.WithValue("Audit");
var categoryCategory = LoggerPrefix.Category.WithValue("Orders");
var tagCategory = LoggerPrefix.Tag.WithValue("Critical");
// Assert
Assert.Equal("Folder:Audit", folderCategory);
Assert.Equal("Category:Orders", categoryCategory);
Assert.Equal("Tag:Critical", tagCategory);
}
[Fact]
public void WithValue_ShouldHandleSpacesInValue() {
// Arrange & Act
var result = LoggerPrefix.Folder.WithValue("My Custom Folder");
// Assert
Assert.Equal("Folder:My Custom Folder", result);
}
[Fact]
public void WithValue_ShouldHandleEmptyValue() {
// Arrange & Act
var result = LoggerPrefix.Folder.WithValue("");
// Assert
Assert.Equal("Folder:", result);
}
[Fact]
public void Parse_ShouldExtractFolderPrefix() {
// Arrange
var categoryName = "Folder:Audit";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Equal(LoggerPrefix.Folder, prefix);
Assert.Equal("Audit", value);
}
[Fact]
public void Parse_ShouldExtractCategoryPrefix() {
// Arrange
var categoryName = "Category:Orders";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Equal(LoggerPrefix.Category, prefix);
Assert.Equal("Orders", value);
}
[Fact]
public void Parse_ShouldExtractTagPrefix() {
// Arrange
var categoryName = "Tag:Critical";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Equal(LoggerPrefix.Tag, prefix);
Assert.Equal("Critical", value);
}
[Fact]
public void Parse_ShouldHandleValueWithSpaces() {
// Arrange
var categoryName = "Folder:My Custom Folder";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Equal(LoggerPrefix.Folder, prefix);
Assert.Equal("My Custom Folder", value);
}
[Fact]
public void Parse_ShouldReturnNullForUnrecognizedPrefix() {
// Arrange
var categoryName = "MyApp.Services.OrderService";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Null(prefix);
Assert.Null(value);
}
[Fact]
public void Parse_ShouldReturnNullForEmptyString() {
// Arrange
var categoryName = "";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Null(prefix);
Assert.Null(value);
}
[Fact]
public void Parse_ShouldHandleEmptyValueAfterPrefix() {
// Arrange
var categoryName = "Folder:";
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Equal(LoggerPrefix.Folder, prefix);
Assert.Equal("", value);
}
[Fact]
public void Parse_ShouldBeCaseSensitive() {
// Arrange
var categoryName = "folder:Audit"; // lowercase 'f'
// Act
var (prefix, value) = LoggerPrefix.Parse(categoryName);
// Assert
Assert.Null(prefix);
Assert.Null(value);
}
[Fact]
public void GetAll_ShouldReturnAllPrefixes() {
// Arrange & Act
var allPrefixes = MaksIT.Core.Abstractions.Enumeration.GetAll<LoggerPrefix>().ToList();
// Assert
Assert.Equal(3, allPrefixes.Count);
Assert.Contains(LoggerPrefix.Folder, allPrefixes);
Assert.Contains(LoggerPrefix.Category, allPrefixes);
Assert.Contains(LoggerPrefix.Tag, allPrefixes);
}
[Fact]
public void ToString_ShouldReturnPrefixName() {
// Arrange & Act & Assert
Assert.Equal("Folder:", LoggerPrefix.Folder.ToString());
Assert.Equal("Category:", LoggerPrefix.Category.ToString());
Assert.Equal("Tag:", LoggerPrefix.Tag.ToString());
}
}

View File

@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>

View File

@@ -0,0 +1,190 @@
namespace MaksIT.Core.Tests.Security;
using MaksIT.Core.Security;
public class Base64UrlUtilityTests {
#region Encode Tests
[Fact]
public void Encode_String_ReturnsBase64UrlString() {
// Arrange
var input = "Hello, World!";
// Act
var result = Base64UrlUtility.Encode(input);
// Assert
Assert.NotNull(result);
Assert.DoesNotContain("+", result);
Assert.DoesNotContain("/", result);
Assert.DoesNotContain("=", result);
}
[Fact]
public void Encode_EmptyString_ReturnsEmptyString() {
// Arrange
var input = "";
// Act
var result = Base64UrlUtility.Encode(input);
// Assert
Assert.Equal("", result);
}
[Fact]
public void Encode_ByteArray_ReturnsBase64UrlString() {
// Arrange
var input = new byte[] { 0x00, 0x01, 0x02, 0x03, 0xFF, 0xFE };
// Act
var result = Base64UrlUtility.Encode(input);
// Assert
Assert.NotNull(result);
Assert.DoesNotContain("+", result);
Assert.DoesNotContain("/", result);
Assert.DoesNotContain("=", result);
}
[Fact]
public void Encode_NullByteArray_ThrowsArgumentNullException() {
// Arrange
byte[] input = null!;
// Act & Assert
Assert.Throws<ArgumentNullException>(() => Base64UrlUtility.Encode(input));
}
[Theory]
[InlineData("f", "Zg")]
[InlineData("fo", "Zm8")]
[InlineData("foo", "Zm9v")]
[InlineData("foob", "Zm9vYg")]
[InlineData("fooba", "Zm9vYmE")]
[InlineData("foobar", "Zm9vYmFy")]
public void Encode_RFC4648TestVectors_ReturnsExpectedResult(string input, string expected) {
// Act
var result = Base64UrlUtility.Encode(input);
// Assert
Assert.Equal(expected, result);
}
[Fact]
public void Encode_StringWithSpecialChars_HandlesCorrectly() {
// Arrange - characters that would produce + and / in standard base64
var input = "subjects?_d";
// Act
var result = Base64UrlUtility.Encode(input);
// Assert
Assert.DoesNotContain("+", result);
Assert.DoesNotContain("/", result);
}
#endregion
#region Decode Tests
[Fact]
public void Decode_ValidBase64Url_ReturnsOriginalBytes() {
// Arrange
var original = new byte[] { 0x00, 0x01, 0x02, 0x03, 0xFF, 0xFE };
var encoded = Base64UrlUtility.Encode(original);
// Act
var decoded = Base64UrlUtility.Decode(encoded);
// Assert
Assert.Equal(original, decoded);
}
[Fact]
public void Decode_NullInput_ThrowsArgumentNullException() {
// Arrange
string input = null!;
// Act & Assert
Assert.Throws<ArgumentNullException>(() => Base64UrlUtility.Decode(input));
}
[Theory]
[InlineData("Zg", "f")]
[InlineData("Zm8", "fo")]
[InlineData("Zm9v", "foo")]
[InlineData("Zm9vYg", "foob")]
[InlineData("Zm9vYmE", "fooba")]
[InlineData("Zm9vYmFy", "foobar")]
public void DecodeToString_RFC4648TestVectors_ReturnsExpectedResult(string input, string expected) {
// Act
var result = Base64UrlUtility.DecodeToString(input);
// Assert
Assert.Equal(expected, result);
}
[Fact]
public void DecodeToString_ValidBase64Url_ReturnsOriginalString() {
// Arrange
var original = "Hello, World!";
var encoded = Base64UrlUtility.Encode(original);
// Act
var decoded = Base64UrlUtility.DecodeToString(encoded);
// Assert
Assert.Equal(original, decoded);
}
[Fact]
public void Decode_EmptyString_ReturnsEmptyArray() {
// Arrange
var input = "";
// Act
var result = Base64UrlUtility.Decode(input);
// Assert
Assert.Empty(result);
}
#endregion
#region Round-trip Tests
[Theory]
[InlineData("Simple text")]
[InlineData("Text with spaces and numbers 123")]
[InlineData("Special chars: !@#$%^&*()")]
[InlineData("Unicode: 日本語 中文 한국어")]
[InlineData("")]
public void RoundTrip_String_ReturnsOriginal(string original) {
// Act
var encoded = Base64UrlUtility.Encode(original);
var decoded = Base64UrlUtility.DecodeToString(encoded);
// Assert
Assert.Equal(original, decoded);
}
[Fact]
public void RoundTrip_BinaryData_ReturnsOriginal() {
// Arrange
var original = new byte[256];
for (int i = 0; i < 256; i++) {
original[i] = (byte)i;
}
// Act
var encoded = Base64UrlUtility.Encode(original);
var decoded = Base64UrlUtility.Decode(encoded);
// Assert
Assert.Equal(original, decoded);
}
#endregion
}
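The RFC 4648 vectors exercised above can be verified independently of the library. A minimal Python sketch of unpadded base64url (standard base64 with the URL-safe alphabet and `=` padding stripped on encode, re-added on decode):

```python
import base64

def b64url_encode(data: bytes) -> str:
    # Standard base64 with '-'/'_' alphabet, '=' padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def b64url_decode(text: str) -> bytes:
    # Pad back to a multiple of 4 before decoding.
    return base64.urlsafe_b64decode(text + "=" * (-len(text) % 4))

# RFC 4648 test vectors, as asserted in the tests above.
for plain, encoded in [("f", "Zg"), ("fo", "Zm8"), ("foo", "Zm9v"),
                       ("foob", "Zm9vYg"), ("fooba", "Zm9vYmE"),
                       ("foobar", "Zm9vYmFy")]:
    assert b64url_encode(plain.encode()) == encoded
    assert b64url_decode(encoded) == plain.encode()
```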

View File

@@ -0,0 +1,75 @@
namespace MaksIT.Core.Tests.Webapi.Models;
using MaksIT.Core.Webapi.Models;
public class PatchOperationTests {
[Fact]
public void PatchOperation_HasExpectedValues() {
// Assert - verify all enum values exist
Assert.Equal(0, (int)PatchOperation.SetField);
Assert.Equal(1, (int)PatchOperation.RemoveField);
Assert.Equal(2, (int)PatchOperation.AddToCollection);
Assert.Equal(3, (int)PatchOperation.RemoveFromCollection);
}
[Fact]
public void PatchOperation_HasFourValues() {
// Arrange
var values = Enum.GetValues<PatchOperation>();
// Assert
Assert.Equal(4, values.Length);
}
[Theory]
[InlineData(PatchOperation.SetField, "SetField")]
[InlineData(PatchOperation.RemoveField, "RemoveField")]
[InlineData(PatchOperation.AddToCollection, "AddToCollection")]
[InlineData(PatchOperation.RemoveFromCollection, "RemoveFromCollection")]
public void PatchOperation_ToString_ReturnsCorrectName(PatchOperation operation, string expectedName) {
// Act
var result = operation.ToString();
// Assert
Assert.Equal(expectedName, result);
}
[Theory]
[InlineData("SetField", PatchOperation.SetField)]
[InlineData("RemoveField", PatchOperation.RemoveField)]
[InlineData("AddToCollection", PatchOperation.AddToCollection)]
[InlineData("RemoveFromCollection", PatchOperation.RemoveFromCollection)]
public void PatchOperation_Parse_ReturnsCorrectValue(string name, PatchOperation expected) {
// Act
var result = Enum.Parse<PatchOperation>(name);
// Assert
Assert.Equal(expected, result);
}
[Fact]
public void PatchOperation_TryParse_InvalidValue_ReturnsFalse() {
// Act
var result = Enum.TryParse<PatchOperation>("InvalidOperation", out _);
// Assert
Assert.False(result);
}
[Fact]
public void PatchOperation_IsDefined_ValidValues_ReturnsTrue() {
// Assert
Assert.True(Enum.IsDefined(typeof(PatchOperation), 0));
Assert.True(Enum.IsDefined(typeof(PatchOperation), 1));
Assert.True(Enum.IsDefined(typeof(PatchOperation), 2));
Assert.True(Enum.IsDefined(typeof(PatchOperation), 3));
}
[Fact]
public void PatchOperation_IsDefined_InvalidValue_ReturnsFalse() {
// Assert
Assert.False(Enum.IsDefined(typeof(PatchOperation), 99));
Assert.False(Enum.IsDefined(typeof(PatchOperation), -1));
}
}
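The enum contract asserted above (names, underlying values 0 through 3, exactly four members) can be mirrored in Python for illustration; this is not the C# type itself:

```python
from enum import IntEnum

class PatchOperation(IntEnum):
    """Illustrative mirror of the C# PatchOperation enum."""
    SetField = 0
    RemoveField = 1
    AddToCollection = 2
    RemoveFromCollection = 3

assert PatchOperation.SetField == 0
assert len(PatchOperation) == 4
assert PatchOperation["RemoveFromCollection"] is PatchOperation.RemoveFromCollection
```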

View File

@@ -8,7 +8,17 @@ public abstract class BaseFileLogger : ILogger, IDisposable {
private readonly LockManager _lockManager = new LockManager();
private readonly string _folderPath;
private readonly TimeSpan _retentionPeriod;
private static readonly Mutex _fileMutex = new Mutex(false, "Global\\MaksITLoggerFileMutex"); // Named mutex for cross-process locking
private static readonly Mutex _fileMutex = CreateMutex();
private static Mutex CreateMutex() {
try {
// Try Global\ first for cross-session synchronization (services, multiple users)
return new Mutex(false, "Global\\MaksITLoggerFileMutex");
} catch (UnauthorizedAccessException) {
// Fall back to Local\ if Global\ is not allowed (sandboxed/restricted environment)
return new Mutex(false, "Local\\MaksITLoggerFileMutex");
}
}
protected BaseFileLogger(string folderPath, TimeSpan retentionPeriod) {
_folderPath = folderPath;

View File

@@ -1,9 +1,4 @@
using Microsoft.Extensions.Logging;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
namespace MaksIT.Core.Logging;
@@ -18,7 +13,23 @@ public class FileLoggerProvider : ILoggerProvider {
}
public ILogger CreateLogger(string categoryName) {
return new FileLogger(_folderPath, _retentionPeriod);
var folderPath = ResolveFolderPath(categoryName);
return new FileLogger(folderPath, _retentionPeriod);
}
private string ResolveFolderPath(string categoryName) {
var (prefix, value) = LoggerPrefix.Parse(categoryName);
if (prefix == LoggerPrefix.Folder && !string.IsNullOrWhiteSpace(value)) {
return Path.Combine(_folderPath, SanitizeForPath(value));
}
return _folderPath;
}
private static string SanitizeForPath(string input) {
var invalid = Path.GetInvalidPathChars();
return string.Concat(input.Where(c => !invalid.Contains(c)));
}
public void Dispose() { }

View File

@@ -13,7 +13,23 @@ public class JsonFileLoggerProvider : ILoggerProvider {
}
public ILogger CreateLogger(string categoryName) {
return new JsonFileLogger(_folderPath, _retentionPeriod);
var folderPath = ResolveFolderPath(categoryName);
return new JsonFileLogger(folderPath, _retentionPeriod);
}
private string ResolveFolderPath(string categoryName) {
var (prefix, value) = LoggerPrefix.Parse(categoryName);
if (prefix == LoggerPrefix.Folder && !string.IsNullOrWhiteSpace(value)) {
return Path.Combine(_folderPath, SanitizeForPath(value));
}
return _folderPath;
}
private static string SanitizeForPath(string input) {
var invalid = Path.GetInvalidPathChars();
return string.Concat(input.Where(c => !invalid.Contains(c)));
}
public void Dispose() { }
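The folder-resolution logic shared by both providers can be sketched as follows. This is a hedged Python illustration, not the library code; in particular, the invalid-character set here is a small stand-in, since `Path.GetInvalidPathChars()` is platform-dependent (control characters, plus extra characters such as `|` on Windows):

```python
import os

# Stand-in for Path.GetInvalidPathChars(); the real set varies by platform.
INVALID_CHARS = set("<>|\"\x00")

def sanitize_for_path(value: str) -> str:
    return "".join(c for c in value if c not in INVALID_CHARS)

def resolve_folder_path(base: str, category_name: str) -> str:
    # Only a non-blank "Folder:" prefix redirects logs into a subfolder.
    if category_name.startswith("Folder:"):
        value = category_name[len("Folder:"):]
        if value.strip():
            return os.path.join(base, sanitize_for_path(value))
    return base  # type names like "MyApp.Services.OrderService" fall through
```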

View File

@@ -0,0 +1,29 @@
using MaksIT.Core.Abstractions;
namespace MaksIT.Core.Logging;
public class LoggerPrefix : Enumeration {
public static readonly LoggerPrefix Folder = new(1, "Folder:");
public static readonly LoggerPrefix Category = new(2, "Category:");
public static readonly LoggerPrefix Tag = new(3, "Tag:");
private LoggerPrefix(int id, string name) : base(id, name) { }
/// <summary>
/// Creates a category string with this prefix and the given value.
/// </summary>
public string WithValue(string value) => $"{Name}{value}";
/// <summary>
/// Tries to extract the prefix and value from a category name.
/// </summary>
public static (LoggerPrefix? prefix, string? value) Parse(string categoryName) {
foreach (var prefix in GetAll<LoggerPrefix>()) {
if (categoryName.StartsWith(prefix.Name, StringComparison.Ordinal)) {
var value = categoryName.Substring(prefix.Name.Length);
return (prefix, value);
}
}
return (null, null);
}
}
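The `WithValue`/`Parse` round trip above is simple enough to sketch outside C#; a minimal Python equivalent (illustrative only), keeping the ordinal, case-sensitive match of `StartsWith(..., StringComparison.Ordinal)`:

```python
PREFIXES = ("Folder:", "Category:", "Tag:")

def with_value(prefix: str, value: str) -> str:
    return f"{prefix}{value}"

def parse(category_name: str):
    # Case-sensitive prefix match, like the C# ordinal StartsWith above.
    for prefix in PREFIXES:
        if category_name.startswith(prefix):
            return prefix, category_name[len(prefix):]
    return None, None

assert parse(with_value("Folder:", "Audit")) == ("Folder:", "Audit")
assert parse("Folder:") == ("Folder:", "")        # empty value survives
assert parse("folder:Audit") == (None, None)      # case-sensitive
assert parse("MyApp.Services.OrderService") == (None, None)
```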

View File

@@ -1,23 +1,44 @@
<Project Sdk="Microsoft.NET.Sdk">
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<RootNamespace>$(MSBuildProjectName.Replace(" ", "_"))</RootNamespace>
<!-- Enable XML documentation -->
<GenerateDocumentationFile>true</GenerateDocumentationFile>
<NoWarn>$(NoWarn);CS1591</NoWarn>
<!-- NuGet package metadata -->
<PackageId>MaksIT.Core</PackageId>
<Version>1.6.0</Version>
<Version>1.6.1</Version>
<Authors>Maksym Sadovnychyy</Authors>
<Company>MAKS-IT</Company>
<Product>MaksIT.Core</Product>
<Description>MaksIT.Core is a collection of helper methods and extensions for .NET projects, designed to simplify common tasks and improve code readability. The library includes extensions for `Guid`, `string`, `Object`, and a base class for creating enumeration types.</Description>
<PackageTags>dotnet;enumeration;string;guid;object;parsers;extensions;jwt;aes;crc32;</PackageTags>
<Copyright>Copyright © Maksym Sadovnychyy (MAKS-IT)</Copyright>
<Description>A comprehensive .NET library providing utilities for logging (file/JSON with folder organization), security (JWT, JWK, JWS, TOTP, AES-GCM, password hashing), extensions (string, object, LINQ expressions, DateTime), saga orchestration, COMB GUIDs, Web API pagination, and more.</Description>
<PackageTags>dotnet;extensions;logging;file-logger;jwt;jwk;jws;totp;2fa;aes-gcm;password-hasher;saga;comb-guid;pagination;crc32;base32;enumeration</PackageTags>
<PackageProjectUrl>https://github.com/MAKS-IT-COM/maksit-core</PackageProjectUrl>
<RepositoryUrl>https://github.com/MAKS-IT-COM/maksit-core</RepositoryUrl>
<RequireLicenseAcceptance>false</RequireLicenseAcceptance>
<RepositoryType>git</RepositoryType>
<PackageReadmeFile>README.md</PackageReadmeFile>
<PackageLicenseFile>LICENSE.md</PackageLicenseFile>
<PackageReleaseNotes>See https://github.com/MAKS-IT-COM/maksit-core/releases</PackageReleaseNotes>
<RequireLicenseAcceptance>false</RequireLicenseAcceptance>
<!-- Source Link for debugging -->
<PublishRepositoryUrl>true</PublishRepositoryUrl>
<EmbedUntrackedSources>true</EmbedUntrackedSources>
<IncludeSymbols>true</IncludeSymbols>
<SymbolPackageFormat>snupkg</SymbolPackageFormat>
<!-- Deterministic builds for reproducibility -->
<Deterministic>true</Deterministic>
<ContinuousIntegrationBuild Condition="'$(CI)' == 'true'">true</ContinuousIntegrationBuild>
</PropertyGroup>
<ItemGroup>
@@ -25,6 +46,11 @@
<None Include="../../LICENSE.md" Pack="true" PackagePath="" />
</ItemGroup>
<!-- Source Link package -->
<ItemGroup>
<PackageReference Include="Microsoft.SourceLink.GitHub" Version="8.0.0" PrivateAssets="All" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.Cryptography.KeyDerivation" Version="9.0.10" />
<PackageReference Include="Microsoft.AspNetCore.Http.Abstractions" Version="2.3.0" />
@@ -37,4 +63,5 @@
<PackageReference Include="System.Linq.Dynamic.Core" Version="1.6.9" />
<PackageReference Include="System.Threading.RateLimiting" Version="9.0.10" />
</ItemGroup>
</Project>

View File

@@ -1,60 +0,0 @@
# Retrieve the API key from the environment variable
$apiKey = $env:NUGET_MAKS_IT
if (-not $apiKey) {
Write-Host "Error: API key not found in environment variable NUGET_MAKS_IT."
exit 1
}
# NuGet source
$nugetSource = "https://api.nuget.org/v3/index.json"
# Define paths
$solutionDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$projectDir = "$solutionDir\MaksIT.Core"
$outputDir = "$projectDir\bin\Release"
$testProjectDir = "$solutionDir\MaksIT.Core.Tests"
# Clean previous builds
Write-Host "Cleaning previous builds..."
dotnet clean $projectDir -c Release
dotnet clean $testProjectDir -c Release
# Build the test project
Write-Host "Building the test project..."
dotnet build $testProjectDir -c Release
# Run tests
Write-Host "Running tests..."
dotnet test $testProjectDir -c Release
if ($LASTEXITCODE -ne 0) {
Write-Host "Tests failed. Aborting release process."
exit 1
}
# Build the main project
Write-Host "Building the project..."
dotnet build $projectDir -c Release
# Pack the NuGet package
Write-Host "Packing the project..."
dotnet pack $projectDir -c Release --no-build
# Look for the .nupkg file
$packageFile = Get-ChildItem -Path $outputDir -Filter "*.nupkg" -Recurse | Sort-Object LastWriteTime -Descending | Select-Object -First 1
if ($packageFile) {
Write-Host "Package created successfully: $($packageFile.FullName)"
# Push the package to NuGet
Write-Host "Pushing the package to NuGet..."
dotnet nuget push $packageFile.FullName -k $apiKey -s $nugetSource --skip-duplicate
if ($LASTEXITCODE -eq 0) {
Write-Host "Package pushed successfully."
} else {
Write-Host "Failed to push the package."
}
} else {
Write-Host "Package creation failed. No .nupkg file found."
exit 1
}

View File

@@ -1,49 +0,0 @@
#!/bin/sh
# Retrieve the API key from the environment variable
apiKey=$NUGET_MAKS_IT
if [ -z "$apiKey" ]; then
echo "Error: API key not found in environment variable NUGET_MAKS_IT."
exit 1
fi
# NuGet source
nugetSource="https://api.nuget.org/v3/index.json"
# Define paths
scriptDir=$(dirname "$0")
solutionDir=$(realpath "$scriptDir")
projectDir="$solutionDir/MaksIT.Core"
outputDir="$projectDir/bin/Release"
# Clean previous builds
echo "Cleaning previous builds..."
dotnet clean "$projectDir" -c Release
# Build the project
echo "Building the project..."
dotnet build "$projectDir" -c Release
# Pack the NuGet package
echo "Packing the project..."
dotnet pack "$projectDir" -c Release --no-build
# Look for the .nupkg file
packageFile=$(find "$outputDir" -name "*.nupkg" -print0 | xargs -0 ls -t | head -n 1)
if [ -n "$packageFile" ]; then
echo "Package created successfully: $packageFile"
# Push the package to NuGet
echo "Pushing the package to NuGet..."
dotnet nuget push "$packageFile" -k "$apiKey" -s "$nugetSource" --skip-duplicate
if [ $? -eq 0 ]; then
echo "Package pushed successfully."
else
echo "Failed to push the package."
fi
else
echo "Package creation failed. No .nupkg file found."
exit 1
fi

src/scripts/BuildUtils.psm1 Normal file

File diff suppressed because it is too large

View File

@@ -0,0 +1,6 @@
@echo off
setlocal
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "%~dp0Force-AmendTaggedCommit.ps1"
pause

View File

@@ -0,0 +1,201 @@
<#
.SYNOPSIS
Amends the latest commit, recreates its associated tag, and force pushes both to remote.
.DESCRIPTION
This script performs the following operations:
1. Gets the last commit and verifies it has an associated tag
2. Stages all pending changes
3. Amends the latest commit (keeps existing message)
4. Deletes and recreates the tag on the amended commit
5. Force pushes the branch and tag to origin
.PARAMETER DryRun
If specified, shows what would be done without making changes.
.EXAMPLE
.\Force-AmendTaggedCommit.ps1
.EXAMPLE
.\Force-AmendTaggedCommit.ps1 -DryRun
#>
[CmdletBinding()]
param(
[Parameter(Mandatory = $false)]
[switch]$DryRun
)
$ErrorActionPreference = "Stop"
function Write-Step {
param([string]$Text)
Write-Host "`n>> $Text" -ForegroundColor Cyan
}
function Write-Success {
param([string]$Text)
Write-Host " $Text" -ForegroundColor Green
}
function Write-Info {
param([string]$Text)
Write-Host " $Text" -ForegroundColor Gray
}
function Write-Warn {
param([string]$Text)
Write-Host " $Text" -ForegroundColor Yellow
}
function Invoke-Git {
param(
[Parameter(Mandatory = $true)]
[string[]]$Arguments,
[Parameter(Mandatory = $false)]
[switch]$CaptureOutput,
[Parameter(Mandatory = $false)]
[string]$ErrorMessage = "Git command failed"
)
if ($CaptureOutput) {
$output = & git @Arguments 2>&1
$exitCode = $LASTEXITCODE
if ($exitCode -ne 0) {
throw "$ErrorMessage (exit code: $exitCode)"
}
return $output
} else {
& git @Arguments
$exitCode = $LASTEXITCODE
if ($exitCode -ne 0) {
throw "$ErrorMessage (exit code: $exitCode)"
}
}
}
try {
Write-Host "`n========================================" -ForegroundColor Magenta
Write-Host " Force Amend Tagged Commit Script" -ForegroundColor Magenta
Write-Host "========================================`n" -ForegroundColor Magenta
if ($DryRun) {
Write-Warn "*** DRY RUN MODE - No changes will be made ***`n"
}
# Get current branch
Write-Step "Getting current branch..."
$Branch = Invoke-Git -Arguments @("rev-parse", "--abbrev-ref", "HEAD") -CaptureOutput -ErrorMessage "Failed to get current branch"
Write-Info "Branch: $Branch"
# Get last commit info
Write-Step "Getting last commit..."
$null = Invoke-Git -Arguments @("rev-parse", "HEAD") -CaptureOutput -ErrorMessage "Failed to get HEAD commit"
$CommitMessage = Invoke-Git -Arguments @("log", "-1", "--format=%s") -CaptureOutput
$CommitHash = Invoke-Git -Arguments @("log", "-1", "--format=%h") -CaptureOutput
Write-Info "Commit: $CommitHash - $CommitMessage"
# Find tag pointing to HEAD
Write-Step "Finding tag on last commit..."
$Tags = & git tag --points-at HEAD 2>&1
if (-not $Tags -or [string]::IsNullOrWhiteSpace("$Tags")) {
throw "No tag found on the last commit ($CommitHash). This script requires the last commit to have an associated tag."
}
# If multiple tags, use the first one
$TagName = ("$Tags" -split "`n")[0].Trim()
Write-Success "Found tag: $TagName"
# Show current status
Write-Step "Checking pending changes..."
$Status = & git status --short 2>&1
if ($Status -and -not [string]::IsNullOrWhiteSpace("$Status")) {
Write-Info "Pending changes:"
"$Status" -split "`n" | ForEach-Object { Write-Info " $_" }
} else {
Write-Warn "No pending changes found"
$confirm = Read-Host "`n No changes to amend. Continue to recreate tag and force push? (y/N)"
if ($confirm -ne 'y' -and $confirm -ne 'Y') {
Write-Host "`nAborted by user" -ForegroundColor Yellow
exit 0
}
}
# Confirm operation
Write-Host "`n----------------------------------------" -ForegroundColor White
Write-Host " Summary of operations:" -ForegroundColor White
Write-Host "----------------------------------------" -ForegroundColor White
Write-Host " Branch: $Branch" -ForegroundColor White
Write-Host " Commit: $CommitHash" -ForegroundColor White
Write-Host " Tag: $TagName" -ForegroundColor White
Write-Host " Remote: origin" -ForegroundColor White
Write-Host "----------------------------------------`n" -ForegroundColor White
if (-not $DryRun) {
$confirm = Read-Host " Proceed with amend and force push? (y/N)"
if ($confirm -ne 'y' -and $confirm -ne 'Y') {
Write-Host "`nAborted by user" -ForegroundColor Yellow
exit 0
}
}
# Stage all changes
Write-Step "Staging all changes..."
if (-not $DryRun) {
Invoke-Git -Arguments @("add", "-A") -ErrorMessage "Failed to stage changes"
}
Write-Success "All changes staged"
# Amend commit
Write-Step "Amending commit..."
if (-not $DryRun) {
Invoke-Git -Arguments @("commit", "--amend", "--no-edit") -ErrorMessage "Failed to amend commit"
}
Write-Success "Commit amended"
# Delete local tag
Write-Step "Deleting local tag '$TagName'..."
if (-not $DryRun) {
Invoke-Git -Arguments @("tag", "-d", $TagName) -ErrorMessage "Failed to delete local tag"
}
Write-Success "Local tag deleted"
# Recreate tag on new commit
Write-Step "Recreating tag '$TagName' on amended commit..."
if (-not $DryRun) {
Invoke-Git -Arguments @("tag", $TagName) -ErrorMessage "Failed to create tag"
}
Write-Success "Tag recreated"
# Force push branch
Write-Step "Force pushing branch '$Branch' to origin..."
if (-not $DryRun) {
Invoke-Git -Arguments @("push", "--force", "origin", $Branch) -ErrorMessage "Failed to force push branch"
}
Write-Success "Branch force pushed"
# Force push tag
Write-Step "Force pushing tag '$TagName' to origin..."
if (-not $DryRun) {
Invoke-Git -Arguments @("push", "--force", "origin", $TagName) -ErrorMessage "Failed to force push tag"
}
Write-Success "Tag force pushed"
Write-Host "`n========================================" -ForegroundColor Green
Write-Host " Operation completed successfully!" -ForegroundColor Green
Write-Host "========================================`n" -ForegroundColor Green
# Show final state
Write-Host "Final state:" -ForegroundColor White
& git log -1 --oneline
Write-Host ""
} catch {
Write-Host "`n========================================" -ForegroundColor Red
Write-Host " ERROR: $($_.Exception.Message)" -ForegroundColor Red
Write-Host "========================================`n" -ForegroundColor Red
exit 1
}

View File

@@ -0,0 +1,9 @@
@echo off
REM Change directory to the location of the script
cd /d %~dp0
REM Run the AI changelog generator
powershell -ExecutionPolicy Bypass -File "%~dp0Generate-Changelog.ps1"
pause

View File

@@ -0,0 +1,452 @@
<#
.SYNOPSIS
AI-assisted changelog generation and license year update.
.DESCRIPTION
Generates changelog entries from uncommitted changes using a 3-pass LLM pipeline:
1. Analyze: Convert changes to changelog items
2. Consolidate: Merge similar items, remove duplicates
3. Format: Structure as Keep a Changelog format
Also updates LICENSE.md copyright year if needed.
Optional RAG pre-processing clusters related changes using embeddings.
All configuration is in changelogsettings.json.
.PARAMETER DryRun
Show what would be generated without making changes.
Enables debug output showing intermediate LLM results.
Does not modify CHANGELOG.md or LICENSE.md.
.EXAMPLE
.\Generate-Changelog.ps1
Generate changelog and update license.
.EXAMPLE
.\Generate-Changelog.ps1 -DryRun
Dry run: preview without making changes.
.NOTES
Requires:
- Ollama running locally (configured in changelogsettings.json)
- OllamaClient.psm1 and BuildUtils.psm1 modules
Configuration (changelogsettings.json):
- csprojPath: Path to .csproj file for version
- outputFile: Path to CHANGELOG.md
- licensePath: Path to LICENSE.md
- debug: Enable debug output
- models: LLM models for each pass
- prompts: Prompt templates
#>
param(
[switch]$DryRun
)
# ==============================================================================
# PATH CONFIGURATION
# ==============================================================================
$scriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path
$repoRoot = git rev-parse --show-toplevel 2>$null
if (-not $repoRoot) {
# Fallback if not in git repo - go up two levels (scripts -> src -> repo root)
$repoRoot = Split-Path -Parent (Split-Path -Parent $scriptDir)
}
$repoRoot = $repoRoot.Trim()
# Solution directory is one level up from scripts
$solutionDir = Split-Path -Parent $scriptDir
# ==============================================================================
# LOAD SETTINGS
# ==============================================================================
$settingsPath = Join-Path $scriptDir "changelogsettings.json"
if (-not (Test-Path $settingsPath)) {
Write-Error "Settings file not found: $settingsPath"
exit 1
}
$settings = Get-Content $settingsPath -Raw | ConvertFrom-Json
Write-Host "Loaded settings from changelogsettings.json" -ForegroundColor Gray
# Resolve paths relative to script location
$CsprojPath = if ($settings.changelog.csprojPath) {
[System.IO.Path]::GetFullPath((Join-Path $scriptDir $settings.changelog.csprojPath))
}
else {
Join-Path $solutionDir "MaksIT.Core\MaksIT.Core.csproj"
}
$OutputFile = if ($settings.changelog.outputFile) {
[System.IO.Path]::GetFullPath((Join-Path $scriptDir $settings.changelog.outputFile))
}
else {
$null
}
$LicensePath = if ($settings.changelog.licensePath) {
[System.IO.Path]::GetFullPath((Join-Path $scriptDir $settings.changelog.licensePath))
}
else {
$null
}
# ==============================================================================
# LICENSE YEAR UPDATE
# ==============================================================================
if ($LicensePath -and (Test-Path $LicensePath)) {
Write-Host "Checking LICENSE.md copyright year..." -ForegroundColor Gray
$currentYear = (Get-Date).Year
$licenseContent = Get-Content $LicensePath -Raw
# Match pattern: "Copyright (c) YYYY - YYYY" and update end year
$licensePattern = "(Copyright \(c\) \d{4}\s*-\s*)(\d{4})"
if ($licenseContent -match $licensePattern) {
$existingEndYear = [int]$Matches[2]
if ($existingEndYear -lt $currentYear) {
if ($DryRun) {
Write-Host "[DryRun] LICENSE.md needs update: $existingEndYear -> $currentYear" -ForegroundColor Yellow
}
else {
Write-Host "Updating LICENSE.md copyright year: $existingEndYear -> $currentYear" -ForegroundColor Cyan
$updatedContent = $licenseContent -replace $licensePattern, "`${1}$currentYear"
Set-Content -Path $LicensePath -Value $updatedContent -NoNewline
Write-Host "LICENSE.md updated." -ForegroundColor Green
}
}
else {
Write-Host "LICENSE.md copyright year is current ($existingEndYear)." -ForegroundColor Gray
}
}
}
# ==============================================================================
# IMPORT MODULES
# ==============================================================================
# Import build utilities
$buildUtilsPath = Join-Path $scriptDir "BuildUtils.psm1"
if (Test-Path $buildUtilsPath) {
Import-Module $buildUtilsPath -Force
}
else {
Write-Error "BuildUtils.psm1 not found: $buildUtilsPath"
exit 1
}
# Import Ollama client
$ollamaModulePath = Join-Path $scriptDir "OllamaClient.psm1"
if (-not $settings.ollama.enabled) {
Write-Error "Ollama is disabled in changelogsettings.json"
exit 1
}
if (-not (Test-Path $ollamaModulePath)) {
Write-Error "OllamaClient.psm1 not found: $ollamaModulePath"
exit 1
}
Import-Module $ollamaModulePath -Force
Set-OllamaConfig -ApiUrl $settings.ollama.apiUrl `
-DefaultContextWindow $settings.ollama.defaultContextWindow `
-DefaultTimeout $settings.ollama.defaultTimeout
# ==============================================================================
# CHANGELOG CONFIGURATION
# ==============================================================================
$clSettings = $settings.changelog
$changelogConfig = @{
Debug = if ($DryRun) { $true } else { $clSettings.debug }
EnableRAG = $clSettings.enableRAG
SimilarityThreshold = $clSettings.similarityThreshold
FileExtension = $clSettings.fileExtension
ExcludePatterns = if ($clSettings.excludePatterns) { @($clSettings.excludePatterns) } else { @() }
Models = @{
Analyze = @{
Name = $clSettings.models.analyze.name
Context = $clSettings.models.analyze.context
MaxTokens = if ($null -ne $clSettings.models.analyze.maxTokens) { $clSettings.models.analyze.maxTokens } else { 0 }
}
Reason = @{
Name = $clSettings.models.reason.name
Context = $clSettings.models.reason.context
MaxTokens = if ($null -ne $clSettings.models.reason.maxTokens) { $clSettings.models.reason.maxTokens } else { 0 }
Temperature = if ($clSettings.models.reason.temperature) { $clSettings.models.reason.temperature } else { 0.1 }
}
Write = @{
Name = $clSettings.models.write.name
Context = $clSettings.models.write.context
MaxTokens = if ($null -ne $clSettings.models.write.maxTokens) { $clSettings.models.write.maxTokens } else { 0 }
}
Embed = @{ Name = $clSettings.models.embed.name }
}
Prompts = @{
Analyze = if ($clSettings.prompts.analyze) {
if ($clSettings.prompts.analyze -is [array]) { $clSettings.prompts.analyze -join "`n" } else { $clSettings.prompts.analyze }
} else { "Convert changes to changelog: {{changes}}" }
Reason = if ($clSettings.prompts.reason) {
if ($clSettings.prompts.reason -is [array]) { $clSettings.prompts.reason -join "`n" } else { $clSettings.prompts.reason }
} else { "Consolidate: {{input}}" }
Format = if ($clSettings.prompts.format) {
if ($clSettings.prompts.format -is [array]) { $clSettings.prompts.format -join "`n" } else { $clSettings.prompts.format }
} else { "Format as changelog: {{items}}" }
}
}
# ==============================================================================
# AI CHANGELOG GENERATION FUNCTION
# ==============================================================================
function Get-AIChangelogSuggestion {
param(
[Parameter(Mandatory)][string]$Changes,
[Parameter(Mandatory)][string]$Version
)
$cfg = $script:changelogConfig
$debug = $cfg.Debug
# === RAG PRE-PROCESSING ===
$processedChanges = $Changes
if ($cfg.EnableRAG) {
Write-Host " RAG Pre-processing ($($cfg.Models.Embed.Name))..." -ForegroundColor Cyan
$changeArray = $Changes -split "`n" | Where-Object { $_.Trim() -ne "" }
if ($changeArray.Length -gt 3) {
Write-Host " RAG: Embedding $($changeArray.Length) changes..." -ForegroundColor Gray
$clusters = Group-TextsByEmbedding -Model $cfg.Models.Embed.Name -Texts $changeArray -SimilarityThreshold $cfg.SimilarityThreshold
Write-Host " RAG: Reduced to $($clusters.Length) groups" -ForegroundColor Green
# Format clusters
$grouped = @()
foreach ($cluster in $clusters) {
if ($cluster.Length -eq 1) {
$grouped += $cluster[0]
}
else {
$grouped += "[RELATED CHANGES]`n" + ($cluster -join "`n") + "`n[/RELATED CHANGES]"
}
}
$processedChanges = $grouped -join "`n"
if ($debug) {
Write-Host "`n [DEBUG] RAG grouped changes:" -ForegroundColor Magenta
Write-Host $processedChanges -ForegroundColor DarkGray
Write-Host ""
}
}
}
# === PASS 1: Analyze changes ===
$m1 = $cfg.Models.Analyze
Write-Host " Pass 1/3: Analyzing ($($m1.Name), ctx:$($m1.Context))..." -ForegroundColor Gray
$prompt1 = $cfg.Prompts.Analyze -replace '\{\{changes\}\}', $processedChanges
$pass1 = Invoke-OllamaPrompt -Model $m1.Name -ContextWindow $m1.Context -MaxTokens $m1.MaxTokens -Prompt $prompt1
if (-not $pass1) { return $null }
if ($debug) { Write-Host "`n [DEBUG] Pass 1 output:" -ForegroundColor Magenta; Write-Host $pass1 -ForegroundColor DarkGray; Write-Host "" }
# === PASS 2: Consolidate ===
$m2 = $cfg.Models.Reason
Write-Host " Pass 2/3: Consolidating ($($m2.Name), ctx:$($m2.Context))..." -ForegroundColor Gray
$prompt2 = $cfg.Prompts.Reason -replace '\{\{input\}\}', $pass1
$pass2 = Invoke-OllamaPrompt -Model $m2.Name -ContextWindow $m2.Context -MaxTokens $m2.MaxTokens -Temperature $m2.Temperature -Prompt $prompt2
if (-not $pass2) { return $pass1 }
if ($pass2 -match "</think>") { $pass2 = ($pass2 -split "</think>")[-1].Trim() }
if ($debug) { Write-Host "`n [DEBUG] Pass 2 output:" -ForegroundColor Magenta; Write-Host $pass2 -ForegroundColor DarkGray; Write-Host "" }
# === PASS 3: Format ===
$m3 = $cfg.Models.Write
Write-Host " Pass 3/3: Formatting ($($m3.Name), ctx:$($m3.Context))..." -ForegroundColor Gray
$prompt3 = $cfg.Prompts.Format -replace '\{\{items\}\}', $pass2
$pass3 = Invoke-OllamaPrompt -Model $m3.Name -ContextWindow $m3.Context -MaxTokens $m3.MaxTokens -Prompt $prompt3
if (-not $pass3) { return $pass2 }
if ($debug) { Write-Host "`n [DEBUG] Pass 3 output:" -ForegroundColor Magenta; Write-Host $pass3 -ForegroundColor DarkGray; Write-Host "" }
# Clean up preamble
if ($pass3 -match "(### Added|### Changed|### Fixed|### Removed)") {
$pass3 = $pass3.Substring($pass3.IndexOf($Matches[0]))
}
# Clean up headers - remove any extra text after "### Added" etc.
$pass3 = $pass3 -replace '(### Added)[^\n]*', '### Added'
$pass3 = $pass3 -replace '(### Changed)[^\n]*', '### Changed'
$pass3 = $pass3 -replace '(### Fixed)[^\n]*', '### Fixed'
$pass3 = $pass3 -replace '(### Removed)[^\n]*', '### Removed'
# Clean up formatting: remove extra blank lines, normalize line endings
$pass3 = $pass3 -replace "`r`n", "`n" # Normalize to LF
$pass3 = $pass3 -replace "(\n\s*){3,}", "`n`n" # Max 1 blank line
$pass3 = $pass3 -replace "- (.+)\n\n- ", "- `$1`n- " # No blank between items
$pass3 = $pass3 -replace "\n{2,}(### )", "`n`n`$1" # One blank before headers
# Remove empty sections (e.g., "### Fixed\n- (No items)" or "### Removed\n\n###")
$pass3 = $pass3 -replace "### \w+\s*\n-\s*\(No items\)\s*\n?", ""
$pass3 = $pass3 -replace "### \w+\s*\n\s*\n(?=###|$)", ""
$pass3 = $pass3.Trim()
return $pass3
}
# ==============================================================================
# MAIN EXECUTION
# ==============================================================================
Write-Host ""
Write-Host "==================================================" -ForegroundColor Cyan
Write-Host "AI CHANGELOG GENERATOR" -ForegroundColor Cyan
Write-Host "==================================================" -ForegroundColor Cyan
Write-Host ""
# Check Ollama availability
if (-not (Test-OllamaAvailable)) {
Write-Error "Ollama is not available. Start Ollama and try again."
exit 1
}
Write-Host "Ollama connected: $($settings.ollama.apiUrl)" -ForegroundColor Green
Write-Host "Models: $($changelogConfig.Models.Analyze.Name) | $($changelogConfig.Models.Reason.Name) | $($changelogConfig.Models.Write.Name) | $($changelogConfig.Models.Embed.Name)" -ForegroundColor Gray
Write-Host ""
# Get version from csproj
if (-not (Test-Path $CsprojPath)) {
Write-Error "Csproj file not found: $CsprojPath"
exit 1
}
[xml]$csproj = Get-Content $CsprojPath
$Version = $csproj.Project.PropertyGroup.Version | Where-Object { $_ } | Select-Object -First 1
Write-Host "Version: $Version" -ForegroundColor White
# Filter function for excluding test files
$excludePatterns = $changelogConfig.ExcludePatterns
function Test-Excluded {
param([string]$Item)
foreach ($pattern in $excludePatterns) {
if ($Item -match [regex]::Escape($pattern)) { return $true }
}
return $false
}
# Get committed changes for this version (analyzed diffs)
$committedChanges = Get-CommitChangesAnalysis -Version $Version -CsprojPath $CsprojPath -FileFilter $changelogConfig.FileExtension
$filteredCommitted = $committedChanges | Where-Object { -not (Test-Excluded $_) }
# Get uncommitted changes (staged, modified, new, deleted)
$uncommitted = Get-UncommittedChanges -FileFilter $changelogConfig.FileExtension
$filteredUncommitted = $uncommitted.Summary | Where-Object { -not (Test-Excluded $_) }
# Combine all changes
$allChanges = @()
if ($filteredCommitted.Count -gt 0) { $allChanges += $filteredCommitted }
if ($filteredUncommitted.Count -gt 0) { $allChanges += $filteredUncommitted }
if ($allChanges.Count -eq 0) {
Write-Host "No changes found for version $Version (excluding tests)" -ForegroundColor Yellow
exit 0
}
$changeLog = $allChanges -join "`n"
Write-Host "Found $($filteredCommitted.Count) committed changes" -ForegroundColor Gray
Write-Host "Found $($filteredUncommitted.Count) uncommitted changes" -ForegroundColor Gray
Write-Host ""
# Generate changelog from uncommitted changes
$suggestion = Get-AIChangelogSuggestion -Changes $changeLog -Version $Version
if ($suggestion) {
$fullEntry = "## v$Version`n`n$suggestion"
Write-Host ""
Write-Host "==========================================" -ForegroundColor Green
Write-Host "AI SUGGESTED CHANGELOG ENTRY" -ForegroundColor Green
Write-Host "==========================================" -ForegroundColor Green
Write-Host ""
Write-Host $fullEntry -ForegroundColor White
Write-Host ""
Write-Host "==========================================" -ForegroundColor Green
# Update changelog file if specified and not in DryRun mode
if ($OutputFile -and -not $DryRun) {
if (Test-Path $OutputFile) {
# Read existing content
$existingContent = Get-Content $OutputFile -Raw
# Check if this version already exists
if ($existingContent -match "## v$([regex]::Escape($Version))\b") {
Write-Host ""
Write-Host "WARNING: Version $Version already exists in $OutputFile" -ForegroundColor Yellow
Write-Host "Skipping file update. Review and update manually if needed." -ForegroundColor Yellow
}
else {
# Find insertion point (after header, before first version entry)
# Header typically ends before first "## v" line
if ($existingContent -match '(?s)(^.*?)(\r?\n)(## v)') {
$header = $Matches[1]
$newline = $Matches[2]
$rest = $existingContent.Substring($header.Length + $newline.Length)
$newContent = $header + "`n`n" + $fullEntry + "`n`n" + $rest
}
else {
# No existing version entries - append after content
$newContent = $existingContent.TrimEnd() + "`n`n" + $fullEntry + "`n"
}
# Collapse runs of blank lines to a single blank line
$newContent = $newContent -replace "(\r?\n){3,}", "`n`n"
$newContent | Out-File -FilePath $OutputFile -Encoding utf8 -NoNewline
Write-Host ""
Write-Host "Updated: $OutputFile" -ForegroundColor Cyan
}
}
else {
# Create new file with header
$newContent = @"
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
$fullEntry
"@
$newContent | Out-File -FilePath $OutputFile -Encoding utf8
Write-Host ""
Write-Host "Created: $OutputFile" -ForegroundColor Cyan
}
}
elseif ($OutputFile -and $DryRun) {
Write-Host ""
Write-Host "[DryRun] Would update: $OutputFile" -ForegroundColor Yellow
}
Write-Host ""
if ($DryRun) {
Write-Host "DryRun complete. No files were modified." -ForegroundColor Yellow
}
else {
Write-Host "Review the changelog entry, then commit." -ForegroundColor Yellow
}
}
else {
Write-Error "AI changelog generation failed"
exit 1
}
Write-Host ""
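The CHANGELOG update in the script splices the new `## vX.Y.Z` entry between the file header and the first existing version section, then collapses surplus blank lines. A language-neutral mirror of that splice logic (Python, illustrative only; `insert_entry` and the sample strings are not part of this commit):

```python
import re

def insert_entry(existing: str, entry: str) -> str:
    """Insert a new version entry after the changelog header, before the
    first existing '## v' section (mirrors the PowerShell pattern
    '(?s)(^.*?)(\\r?\\n)(## v)')."""
    m = re.search(r"(?s)(^.*?)(\r?\n)(## v)", existing)
    if m:
        header = m.group(1)
        rest = existing[len(m.group(1)) + len(m.group(2)):]
        new_content = header + "\n\n" + entry + "\n\n" + rest
    else:
        # No version entries yet: append after the header text
        new_content = existing.rstrip() + "\n\n" + entry + "\n"
    # Collapse runs of blank lines, as the script does
    return re.sub(r"(\r?\n){3,}", "\n\n", new_content)

changelog = "# Changelog\n\nIntro text.\n\n## v1.6.0\n\n### Added\n- X\n"
result = insert_entry(changelog, "## v1.6.1\n\n### Added\n- Y")
```

The lazy `.*?` in the pattern is what keeps the match anchored to the *first* `## v` section, so newest entries always land on top.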


@@ -0,0 +1,572 @@
<#
.SYNOPSIS
Generic Ollama API client module for PowerShell.
.DESCRIPTION
Provides a simple interface to interact with Ollama's local LLM API:
- Text generation (chat/completion)
- Embeddings generation
- Model management
- RAG utilities (cosine similarity, clustering)
.NOTES
Requires Ollama running locally (default: http://localhost:11434).
.EXAMPLE
Import-Module .\OllamaClient.psm1
# Configure
Set-OllamaConfig -ApiUrl "http://localhost:11434"
# Check availability
if (Test-OllamaAvailable) {
# Generate text
$response = Invoke-OllamaPrompt -Model "llama3.1:8b" -Prompt "Hello!"
# Get embeddings
$embedding = Get-OllamaEmbedding -Model "nomic-embed-text" -Text "Sample text"
}
#>
# ==============================================================================
# MODULE CONFIGURATION
# ==============================================================================
$script:OllamaConfig = @{
ApiUrl = "http://localhost:11434"
DefaultTimeout = 180
DefaultTemperature = 0.2
DefaultMaxTokens = 0
DefaultContextWindow = 0
}
# ==============================================================================
# CONFIGURATION FUNCTIONS
# ==============================================================================
function Set-OllamaConfig {
<#
.SYNOPSIS
Configure Ollama client settings.
.PARAMETER ApiUrl
Ollama API endpoint URL (default: http://localhost:11434).
.PARAMETER DefaultTimeout
Default timeout in seconds for API calls.
.PARAMETER DefaultTemperature
Default temperature for text generation (0.0-1.0).
.PARAMETER DefaultMaxTokens
Default maximum tokens to generate.
.PARAMETER DefaultContextWindow
Default context window size (num_ctx).
#>
param(
[string]$ApiUrl,
[int]$DefaultTimeout,
[double]$DefaultTemperature,
[int]$DefaultMaxTokens,
[int]$DefaultContextWindow
)
if ($ApiUrl) {
$script:OllamaConfig.ApiUrl = $ApiUrl
}
if ($PSBoundParameters.ContainsKey('DefaultTimeout')) {
$script:OllamaConfig.DefaultTimeout = $DefaultTimeout
}
if ($PSBoundParameters.ContainsKey('DefaultTemperature')) {
$script:OllamaConfig.DefaultTemperature = $DefaultTemperature
}
if ($PSBoundParameters.ContainsKey('DefaultMaxTokens')) {
$script:OllamaConfig.DefaultMaxTokens = $DefaultMaxTokens
}
if ($PSBoundParameters.ContainsKey('DefaultContextWindow')) {
$script:OllamaConfig.DefaultContextWindow = $DefaultContextWindow
}
}
function Get-OllamaConfig {
<#
.SYNOPSIS
Get current Ollama client configuration.
#>
return $script:OllamaConfig.Clone()
}
# ==============================================================================
# CONNECTION & STATUS
# ==============================================================================
function Test-OllamaAvailable {
<#
.SYNOPSIS
Check if Ollama API is available and responding.
.OUTPUTS
Boolean indicating if Ollama is available.
#>
try {
$null = Invoke-RestMethod -Uri "$($script:OllamaConfig.ApiUrl)/api/tags" -TimeoutSec 5 -ErrorAction Stop
return $true
}
catch {
return $false
}
}
function Get-OllamaModels {
<#
.SYNOPSIS
Get list of available models from Ollama.
.OUTPUTS
Array of model objects with name, size, and other properties.
#>
try {
$response = Invoke-RestMethod -Uri "$($script:OllamaConfig.ApiUrl)/api/tags" -TimeoutSec 10 -ErrorAction Stop
return $response.models
}
catch {
Write-Warning "Failed to get Ollama models: $_"
return @()
}
}
function Test-OllamaModel {
<#
.SYNOPSIS
Check if a specific model is available in Ollama.
.PARAMETER Model
Model name to check.
#>
param([Parameter(Mandatory)][string]$Model)
$models = Get-OllamaModels
return [bool]($models | Where-Object { $_.name -eq $Model -or $_.name -like "${Model}:*" })
}
# ==============================================================================
# TEXT GENERATION
# ==============================================================================
function Invoke-OllamaPrompt {
<#
.SYNOPSIS
Send a prompt to an Ollama model and get a response.
.PARAMETER Model
Model name (e.g., "llama3.1:8b", "qwen2.5-coder:7b").
.PARAMETER Prompt
The prompt text to send.
.PARAMETER ContextWindow
Context window size (num_ctx). Uses default if not specified.
.PARAMETER MaxTokens
Maximum tokens to generate (num_predict). Uses default if not specified.
.PARAMETER Temperature
Temperature for generation (0.0-1.0). Uses default if not specified.
.PARAMETER Timeout
Timeout in seconds. Uses default if not specified.
.PARAMETER System
Optional system prompt.
.OUTPUTS
Generated text response or $null if failed.
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][string]$Prompt,
[int]$ContextWindow,
[int]$MaxTokens,
[double]$Temperature,
[int]$Timeout,
[string]$System
)
$config = $script:OllamaConfig
# Use defaults if not specified
if (-not $PSBoundParameters.ContainsKey('MaxTokens')) { $MaxTokens = $config.DefaultMaxTokens }
if (-not $PSBoundParameters.ContainsKey('Temperature')) { $Temperature = $config.DefaultTemperature }
if (-not $PSBoundParameters.ContainsKey('Timeout')) { $Timeout = $config.DefaultTimeout }
$options = @{
temperature = $Temperature
}
# Only set num_predict if MaxTokens > 0 (0 = unlimited/model default)
if ($MaxTokens -and $MaxTokens -gt 0) {
$options.num_predict = $MaxTokens
}
# Only set context window if explicitly provided (let model use its default otherwise)
if ($ContextWindow -and $ContextWindow -gt 0) {
$options.num_ctx = $ContextWindow
}
$body = @{
model = $Model
prompt = $Prompt
stream = $false
options = $options
}
if ($System) {
$body.system = $System
}
$jsonBody = $body | ConvertTo-Json -Depth 3
# TimeoutSec 0 = infinite wait
$restParams = @{
Uri = "$($config.ApiUrl)/api/generate"
Method = "Post"
Body = $jsonBody
ContentType = "application/json"
}
if ($Timeout -gt 0) { $restParams.TimeoutSec = $Timeout }
try {
$response = Invoke-RestMethod @restParams
return $response.response.Trim()
}
catch {
Write-Warning "Ollama prompt failed: $_"
return $null
}
}
function Invoke-OllamaChat {
<#
.SYNOPSIS
Send a chat conversation to an Ollama model.
.PARAMETER Model
Model name.
.PARAMETER Messages
Array of message objects with 'role' and 'content' properties.
Roles: "system", "user", "assistant"
.PARAMETER ContextWindow
Context window size.
.PARAMETER MaxTokens
Maximum tokens to generate.
.PARAMETER Temperature
Temperature for generation.
.OUTPUTS
Generated response text or $null if failed.
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][array]$Messages,
[int]$ContextWindow,
[int]$MaxTokens,
[double]$Temperature,
[int]$Timeout
)
$config = $script:OllamaConfig
if (-not $PSBoundParameters.ContainsKey('MaxTokens')) { $MaxTokens = $config.DefaultMaxTokens }
if (-not $PSBoundParameters.ContainsKey('Temperature')) { $Temperature = $config.DefaultTemperature }
if (-not $PSBoundParameters.ContainsKey('Timeout')) { $Timeout = $config.DefaultTimeout }
$options = @{
temperature = $Temperature
}
# Only set num_predict if MaxTokens > 0 (0 = unlimited/model default)
if ($MaxTokens -and $MaxTokens -gt 0) {
$options.num_predict = $MaxTokens
}
# Only set context window if explicitly provided
if ($ContextWindow -and $ContextWindow -gt 0) {
$options.num_ctx = $ContextWindow
}
$body = @{
model = $Model
messages = $Messages
stream = $false
options = $options
}
$jsonBody = $body | ConvertTo-Json -Depth 4
# TimeoutSec 0 = infinite wait
$restParams = @{
Uri = "$($config.ApiUrl)/api/chat"
Method = "Post"
Body = $jsonBody
ContentType = "application/json"
}
if ($Timeout -gt 0) { $restParams.TimeoutSec = $Timeout }
try {
$response = Invoke-RestMethod @restParams
return $response.message.content.Trim()
}
catch {
Write-Warning "Ollama chat failed: $_"
return $null
}
}
# ==============================================================================
# EMBEDDINGS
# ==============================================================================
function Get-OllamaEmbedding {
<#
.SYNOPSIS
Get embedding vector for text using an Ollama embedding model.
.PARAMETER Model
Embedding model name (e.g., "nomic-embed-text", "mxbai-embed-large").
.PARAMETER Text
Text to embed.
.PARAMETER Timeout
Timeout in seconds.
.OUTPUTS
Array of doubles representing the embedding vector, or $null if failed.
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][string]$Text,
[int]$Timeout = 30
)
$body = @{
model = $Model
prompt = $Text
} | ConvertTo-Json
try {
$response = Invoke-RestMethod -Uri "$($script:OllamaConfig.ApiUrl)/api/embeddings" -Method Post -Body $body -ContentType "application/json" -TimeoutSec $Timeout
return $response.embedding
}
catch {
Write-Warning "Ollama embedding failed: $_"
return $null
}
}
function Get-OllamaEmbeddings {
<#
.SYNOPSIS
Get embeddings for multiple texts (one API request per text).
.PARAMETER Model
Embedding model name.
.PARAMETER Texts
Array of texts to embed.
.PARAMETER ShowProgress
Show progress indicator.
.OUTPUTS
Array of objects with Text and Embedding properties.
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][string[]]$Texts,
[switch]$ShowProgress
)
$results = @()
$total = $Texts.Count
$current = 0
foreach ($text in $Texts) {
$current++
if ($ShowProgress) {
Write-Progress -Activity "Getting embeddings" -Status "$current of $total" -PercentComplete (($current / $total) * 100)
}
$embedding = Get-OllamaEmbedding -Model $Model -Text $text
if ($embedding) {
$results += @{
Text = $text
Embedding = $embedding
}
}
}
if ($ShowProgress) {
Write-Progress -Activity "Getting embeddings" -Completed
}
return $results
}
# ==============================================================================
# RAG UTILITIES
# ==============================================================================
function Get-CosineSimilarity {
<#
.SYNOPSIS
Calculate cosine similarity between two embedding vectors.
.PARAMETER Vector1
First embedding vector.
.PARAMETER Vector2
Second embedding vector.
.OUTPUTS
Cosine similarity value between -1 and 1.
#>
param(
[Parameter(Mandatory)][double[]]$Vector1,
[Parameter(Mandatory)][double[]]$Vector2
)
if ($Vector1.Length -ne $Vector2.Length) {
Write-Warning "Vector lengths don't match: $($Vector1.Length) vs $($Vector2.Length)"
return 0
}
$dotProduct = 0.0
$norm1 = 0.0
$norm2 = 0.0
for ($i = 0; $i -lt $Vector1.Length; $i++) {
$dotProduct += $Vector1[$i] * $Vector2[$i]
$norm1 += $Vector1[$i] * $Vector1[$i]
$norm2 += $Vector2[$i] * $Vector2[$i]
}
$norm1 = [Math]::Sqrt($norm1)
$norm2 = [Math]::Sqrt($norm2)
if ($norm1 -eq 0 -or $norm2 -eq 0) { return 0 }
return $dotProduct / ($norm1 * $norm2)
}
function Group-TextsByEmbedding {
<#
.SYNOPSIS
Cluster texts by embedding similarity.
.PARAMETER Model
Embedding model name.
.PARAMETER Texts
Array of texts to cluster.
.PARAMETER SimilarityThreshold
Minimum cosine similarity to group texts together (0.0-1.0).
.PARAMETER ShowProgress
Show progress during embedding.
.OUTPUTS
Array of clusters (each cluster is an array of texts).
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][string[]]$Texts,
[double]$SimilarityThreshold = 0.65,
[switch]$ShowProgress
)
if ($Texts.Length -eq 0) { return @() }
if ($Texts.Length -eq 1) { return @(,@($Texts[0])) }
# Get embeddings
$embeddings = @(Get-OllamaEmbeddings -Model $Model -Texts $Texts -ShowProgress:$ShowProgress)
if ($embeddings.Length -eq 0) {
return @($Texts | ForEach-Object { ,@($_) })
}
# Mark all as unclustered
$embeddings | ForEach-Object { $_.Clustered = $false }
# Cluster similar texts
$clusters = @()
for ($i = 0; $i -lt $embeddings.Length; $i++) {
if ($embeddings[$i].Clustered) { continue }
$cluster = @($embeddings[$i].Text)
$embeddings[$i].Clustered = $true
for ($j = $i + 1; $j -lt $embeddings.Length; $j++) {
if ($embeddings[$j].Clustered) { continue }
$similarity = Get-CosineSimilarity -Vector1 $embeddings[$i].Embedding -Vector2 $embeddings[$j].Embedding
if ($similarity -ge $SimilarityThreshold) {
$cluster += $embeddings[$j].Text
$embeddings[$j].Clustered = $true
}
}
$clusters += ,@($cluster)
}
return $clusters
}
function Find-SimilarTexts {
<#
.SYNOPSIS
Find texts most similar to a query using embeddings.
.PARAMETER Model
Embedding model name.
.PARAMETER Query
Query text to find similar texts for.
.PARAMETER Texts
Array of texts to search through.
.PARAMETER TopK
Number of most similar texts to return.
.PARAMETER MinSimilarity
Minimum similarity threshold.
.OUTPUTS
Array of objects with Text and Similarity properties, sorted by similarity.
#>
param(
[Parameter(Mandatory)][string]$Model,
[Parameter(Mandatory)][string]$Query,
[Parameter(Mandatory)][string[]]$Texts,
[int]$TopK = 5,
[double]$MinSimilarity = 0.0
)
# Get query embedding
$queryEmbedding = Get-OllamaEmbedding -Model $Model -Text $Query
if (-not $queryEmbedding) { return @() }
# Get text embeddings and calculate similarities
$results = @()
foreach ($text in $Texts) {
$textEmbedding = Get-OllamaEmbedding -Model $Model -Text $text
if ($textEmbedding) {
$similarity = Get-CosineSimilarity -Vector1 $queryEmbedding -Vector2 $textEmbedding
if ($similarity -ge $MinSimilarity) {
$results += @{
Text = $text
Similarity = $similarity
}
}
}
}
# Sort by similarity and return top K
return $results | Sort-Object -Property Similarity -Descending | Select-Object -First $TopK
}
# ==============================================================================
# MODULE EXPORTS
# ==============================================================================
Export-ModuleMember -Function @(
# Configuration
'Set-OllamaConfig'
'Get-OllamaConfig'
# Connection & Status
'Test-OllamaAvailable'
'Get-OllamaModels'
'Test-OllamaModel'
# Text Generation
'Invoke-OllamaPrompt'
'Invoke-OllamaChat'
# Embeddings
'Get-OllamaEmbedding'
'Get-OllamaEmbeddings'
# RAG Utilities
'Get-CosineSimilarity'
'Group-TextsByEmbedding'
'Find-SimilarTexts'
)
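`Get-CosineSimilarity` and `Group-TextsByEmbedding` together implement a single-pass greedy clustering: each unclustered item seeds a cluster and absorbs every later item whose cosine similarity to the seed meets the threshold. A minimal Python mirror of the algorithm (the toy 2-D vectors stand in for real embedding vectors and are purely illustrative):

```python
import math

def cosine_similarity(v1, v2):
    # Dot product over the product of magnitudes; 0 for zero vectors
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return 0.0 if n1 == 0 or n2 == 0 else dot / (n1 * n2)

def group_by_embedding(items, threshold=0.65):
    """Greedy single-pass clustering; items are (text, vector) pairs."""
    clustered = [False] * len(items)
    clusters = []
    for i, (text_i, vec_i) in enumerate(items):
        if clustered[i]:
            continue
        cluster = [text_i]
        clustered[i] = True
        # Absorb every later unclustered item similar enough to the seed
        for j in range(i + 1, len(items)):
            if clustered[j]:
                continue
            if cosine_similarity(vec_i, items[j][1]) >= threshold:
                cluster.append(items[j][0])
                clustered[j] = True
        clusters.append(cluster)
    return clusters

# Toy vectors: "a" and "b" point roughly the same way, "c" is orthogonal
items = [("a", [1.0, 0.0]), ("b", [0.9, 0.1]), ("c", [0.0, 1.0])]
clusters = group_by_embedding(items)
```

Note that similarity is only measured against the cluster *seed*, not against every member, so the grouping is order-dependent; that trade-off keeps the pass O(n²) worst case with no re-scans.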

File diff suppressed because it is too large


@@ -0,0 +1,105 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$comment": "Configuration for Generate-Changelog.ps1 (AI-assisted changelog generation)",
"ollama": {
"enabled": true,
"apiUrl": "http://localhost:11434",
"defaultTimeout": 0,
"defaultContextWindow": 0
},
"changelog": {
"debug": true,
"enableRAG": true,
"similarityThreshold": 0.65,
"csprojPath": "../MaksIT.Core/MaksIT.Core.csproj",
"outputFile": "../../CHANGELOG.md",
"licensePath": "../../LICENSE.md",
"fileExtension": ".cs",
"excludePatterns": ["Tests:", "Tests.cs", ".Tests."],
"models": {
"analyze": {
"name": "qwen2.5-coder:7b-instruct-q6_K",
"context": 0,
"maxTokens": 0,
"description": "Pass 1: Code commit analysis (7B, fast)"
},
"reason": {
"name": "qwen2.5:7b-instruct-q8_0",
"context": 0,
"maxTokens": 0,
"temperature": 0.1,
"description": "Pass 2: Consolidation (7B, fast)"
},
"write": {
"name": "qwen2.5:7b-instruct-q8_0",
"context": 0,
"maxTokens": 0,
"description": "Pass 3: Formatting (7B, fast)"
},
"embed": {
"name": "mxbai-embed-large",
"description": "RAG: Commit clustering"
}
},
"prompts": {
"analyze": [
"Convert code changes to changelog entries. Include ALL items.",
"",
"Changes:",
"{{changes}}",
"",
"RULES:",
"1. Create ONE bullet point per item",
"2. Include method names mentioned (CreateMutex, ResolveFolderPath, etc.)",
"3. New classes = \"Added [class] for [purpose]\"",
"4. New methods = \"Added [method] to [class]\"",
"5. Deleted files = \"Removed [class/feature]\"",
"6. Exception handling = \"Improved error handling in [class]\"",
"",
"Output bullet points for each change:"
],
"reason": [
"Keep all important details from this changelog.",
"",
"Input:",
"{{input}}",
"",
"RULES:",
"1. KEEP specific method names and class names",
"2. KEEP all distinct features - do not over-consolidate",
"3. Merge ONLY if items are nearly identical",
"4. DO NOT invent new information",
"5. Output 3-10 bullet points",
"",
"Output:"
],
"format": [
"Categorize these items under the correct changelog headers.",
"",
"Items:",
"{{items}}",
"",
"HEADERS (use exactly as shown):",
"### Added",
"### Changed",
"### Fixed",
"### Removed",
"",
"CATEGORIZATION RULES:",
"- \"Added [class/method]\" -> ### Added",
"- \"Improved...\" or \"Enhanced...\" -> ### Changed",
"- \"Fixed...\" -> ### Fixed",
"- \"Removed...\" -> ### Removed",
"",
"Output each item under correct header. Omit empty sections:"
]
}
}
}
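The `{{changes}}`, `{{input}}`, and `{{items}}` placeholders in these prompt templates are filled by a literal `-replace` in Generate-Changelog.ps1. The same substitution can be sketched generically (Python, illustrative only; `fill_template` is not part of this commit):

```python
import re

def fill_template(template: str, values: dict) -> str:
    """Replace {{name}} placeholders, mirroring the script's
    -replace '\\{\\{changes\\}\\}' substitutions. Unknown
    placeholders are left untouched."""
    def sub(m):
        return values.get(m.group(1), m.group(0))
    return re.sub(r"\{\{(\w+)\}\}", sub, template)

prompt = fill_template("Convert changes to changelog: {{changes}}",
                       {"changes": "- Added CreateMutex to BaseFileLogger"})
```

One caveat the PowerShell version inherits from `-replace`: the replacement text is regex-aware, so change summaries containing `$1`-style sequences could be misinterpreted; a literal substitution like the one above avoids that.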


@@ -0,0 +1,47 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$comment": "Configuration for Release-NuGetPackage.ps1. Secrets are stored in environment variables, not here.",
"release": {
"branch": "main",
"$comment": "Tag must be on this branch to release. Set to empty string to allow any branch."
},
"paths": {
"changelogPath": "../../CHANGELOG.md"
},
"gitHub": {
"enabled": true,
"repository": "MAKS-IT-COM/maksit-core",
"$comment": "Explicit GitHub repository (owner/repo). If empty, auto-detects from git remote."
},
"environmentVariables": {
"$comment": "Required environment variables (store secrets here, not in this file)",
"nugetApiKey": "NUGET_MAKS_IT",
"githubToken": "GITHUB_MAKS_IT_COM",
"signingCertPassword": "SIGNING_CERT_PASSWORD",
"smtpPassword": "SMTP_PASSWORD"
},
"qualityGates": {
"coverageThreshold": 0,
"failOnVulnerabilities": true
},
"packageSigning": {
"enabled": false,
"certificatePath": "",
"timestampServer": "http://timestamp.digicert.com"
},
"emailNotification": {
"enabled": false,
"smtpServer": "",
"smtpPort": 587,
"useSsl": true,
"from": "",
"to": ""
}
}