
I have a pipeline that was working fine with a .NET 6 project, but after updating it to .NET 8, the pipeline has been locking up. It reaches a certain point and the timer keeps running, but nothing happens. It eventually times out at around 59 minutes.

These are the last few lines in the log; before this point, two CoreCompile commands complete fine for two smaller projects:

 CoreCompile:
         /usr/share/dotnet/dotnet exec "/usr/share/dotnet/sdk/8.0.402/Roslyn/bincore/csc.dll" /noconfig /unsafe- /checked- /nowarn:1701,1702,1701,1702 /fullpaths /nostdlib+ /errorreport:prompt /warn:8 /define:TRACE;DEBUG;NET;NET8_0;NETCOREAPP;NET5_0_OR_GREATER;NET6_0_OR_GREATER;NET7_0_OR_GREATER;NET8_0_OR_GREATER;NETCOREAPP1_0_OR_GREATER;NETCOREAPP1_1_OR_GREATER;NETCOREAPP2_0_OR_GREATER;NETCOREAPP2_1_OR_GREATER;NETCOREAPP2_2_OR_GREATER;NETCOREAPP3_0_OR_GREATER;NETCOREAPP3_1_OR_GREATER /highentropyva+ /reference:/home/vsts_azpcontainer/.nuget/packages/csvhelper/27.2.1/lib/net6.0/CsvHelper.dll /reference:/home/vsts_azpcontainer/.nuget/packages/easydata.core/1.4.9/lib/net6.0/EasyData.Core.dll /reference:/home/vsts_azpcontainer/.nuget/packages/humanizer.core/2.14.1/lib/net6.0/Humanizer.dll /reference:/home/vsts_azpcontainer/.nuget/packages/korzh.easyquery.db/7.2.2/lib/netstandard2.0/Korzh.EasyQuery.Db.dll /reference:/home/vsts_azpcontainer/.nuget/packages/korzh.easyquery/7.2.2/lib/netstandard2.0/Korzh.EasyQuery.d...
##[warning]Free memory is lower than 5%; Currently used: 95.61%
 CompilerServer: server failed - server rejected the request 'Error reading response: Reached end of stream before end of read.' - MyProject.Entities (net8.0)
##[error]The Operation will be canceled. The next steps may not contain expected logs.
##[error]The operation was canceled.
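One observation about the log above (my reading, not stated in the original post): the compiler-server failure appears immediately after the agent's low-memory warning, which points at memory exhaustion rather than a compiler bug. A small diagnostic step like the following sketch (the step itself is hypothetical, not part of the original pipeline) could be dropped in before the publish to record how much memory the container actually sees:

```shell
# Log total and available memory (in MB) so the pipeline log shows how close
# the build gets to the agent's limit before CoreCompile starts.
free -m | awk '/^Mem:/ {printf "total_mb=%s available_mb=%s\n", $2, $7}'
```

Running this right before `dotnet publish` makes it easy to compare the container's headroom against the build's peak usage.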

Project file for the compile that fails:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="CsvHelper" Version="27.2.1" />
    <PackageReference Include="Korzh.EasyQuery.EntityFrameworkCore.Relational" Version="7.2.2" />
    <PackageReference Include="Npgsql" Version="8.0.4" />
    <PackageReference Include="Npgsql.EntityFrameworkCore.PostgreSQL" Version="8.0.8" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="8.0.8" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="8.0.8" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Relational" Version="8.0.8" />
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>
  <ItemGroup>
    <Folder Include="MigrationsFunctions" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\MyProject.Shared\MyProject.Shared.csproj" />
  </ItemGroup>
</Project>
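If the memory pressure turns out to come from Roslyn analyzers or the shared compiler server, the same switches that can be passed on the command line can instead be set in the project file. A sketch (these property names are standard MSBuild/Roslyn properties, but whether they help depends on what is actually consuming the memory):

```xml
<!-- Optional memory-reducing build settings (sketch, not from the original project). -->
<PropertyGroup>
  <!-- Skip Roslyn analyzers during build; reduces compiler memory at the
       cost of losing analyzer diagnostics. -->
  <RunAnalyzers>false</RunAnalyzers>
  <!-- Run csc in-process instead of through the shared compiler server
       (VBCSCompiler), avoiding the server the log shows failing. -->
  <UseSharedCompilation>false</UseSharedCompilation>
</PropertyGroup>
```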

When I run this command locally in Visual Studio (Package Manager Console), the entire publish completes in under 4 minutes.

Pipeline step (copies the JavaScript front end from the previous step into the .NET API to be hosted, then builds the back end with dotnet publish):

- job: BuildDotnet
    dependsOn: BuildJavascript
    pool:
      vmImage: ubuntu-latest
    container: myproject-core-base-8-new 
    steps:
    - download: current
      artifact: javascript-client
    - bash: |
        cd $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API
        mkdir -p out/Scripts
        npm config fix
        yarn config set registry (jfrog repo)
        npm config set registry (jfrog repo)
        npm config fix
        echo 'npm install'
        npm install
        echo 'copy files'
        cp -rf $(Pipeline.Workspace)/javascript-client/ $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/wwwroot
        mkdir -p out/js
        mkdir -p out/hangfire
        cp -rf ./node_modules $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/out
        cp -rf $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/js/. $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/out/js
        cp -rf $(System.DefaultWorkingDirectory)/client/dist/ $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/wwwroot
        echo 'dotnet publish'
        dotnet publish -c Release -v d -o ./out
        cp -rf $(System.DefaultWorkingDirectory)/docker/. $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/out
        rm -f $(System.DefaultWorkingDirectory)/server/back-end/MyProject.API/wwwroot/index.html
    - publish: server/back-end/MyProject.API/out
      artifact: dotnet-backend-8

The container running the build (myproject-core-base-8-new) is built FROM mcr.microsoft.com/dotnet/sdk:8.0-jammy

2 Answers


  1. Chosen as BEST ANSWER

I determined the build or publish would use about 10 GB of memory (locally), which exceeds the 7 GB maximum of a Microsoft-hosted agent. I brought the memory down by adding -p:RunAnalyzers=false to the dotnet publish command. This works for now, but I would prefer to reduce the build's memory use in other ways.
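    For reference, the fix described above amounts to changing the publish line in the pipeline's bash step to the following (a sketch of the command from this answer, with the options from the original pipeline kept as-is):

    ```shell
    # -p:RunAnalyzers=false skips Roslyn analyzers during compilation,
    # lowering peak memory at the cost of analyzer diagnostics.
    dotnet publish -c Release -v d -o ./out -p:RunAnalyzers=false
    ```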


  2. The memory available to the container is based on the memory available on the hosted agent machine.

    On Microsoft-hosted agents, each VM has 14 GB of SSD disk space; however, you should not expect it to always have 10 GB of free disk space available for your pipelines to run.

    If the memory provided by Microsoft-hosted agents is not enough for your pipelines, it is recommended to set up self-hosted agents on your own machines. Since you own the machines, you can add more memory and other resources to them based on your demands.
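    If a self-hosted agent is used instead, the job's pool block changes from the hosted VM image to the name of the self-hosted agent pool. A sketch (the pool name here is hypothetical; the rest mirrors the job from the question):

    ```yaml
    - job: BuildDotnet
      dependsOn: BuildJavascript
      pool:
        name: MySelfHostedPool   # hypothetical self-hosted agent pool name
      container: myproject-core-base-8-new
    ```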

