
Feature Requests

Anonymous

Feature Requests for Harness. Select 'Category' based on the module you are requesting the feature for.
Kubernetes API on catalog not pulling logs
The Kubernetes API on the catalog is not pulling logs. When we checked the delegate logs, the delegate is ignoring the port from our config (<<hostname>>:6443):

2026-03-05 09:17:20.609 UTC [task-exec-50822] ERROR io.harness.network.Http - Could not connect [taskId=jsssGZO2Qm-NOMveZM7kYA-DEL, URL=https://<<hostname>>/api/v1/namespaces/customerfnbza-opm-stress/pods/opm-write-api-6c4cf56b48-99vp2/log?container=opm-write-api]
java.util.concurrent.ExecutionException: java.net.ConnectException: Connection refused
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:596)
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:555)
    at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:111)
    at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:247)
    at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2349)
    at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2317)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2189)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2079)
    at com.google.common.cache.LocalCache.get(LocalCache.java:4017)
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4040)
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4989)
    at io.harness.network.Http.connectableHttpUrlWithHeaders(Http.java:363)
    at io.harness.delegate.task.executioncapability.HttpConnectionExecutionCapabilityCheck.performCapabilityCheckWithProto(HttpConnectionExecutionCapabilityCheck.java:78)
    at software.wings.delegatetasks.delegatecapability.CapabilityCheckController.lambda$validate$2(CapabilityCheckController.java:79)

Note that the failing URL (https://<<hostname>>/api/...) has no port, while the configured endpoint is <<hostname>>:6443, so the connection falls back to the HTTPS default 443 and is refused.

Details on the request: https://support.harness.io/hc/en-us/requests/105682
Internal Developer Portal
IDP Workflows: Clear stale form fields on Retry when dependency selection changes
Summary
When re-running/retrying an IDP workflow that uses conditional inputs (e.g., fields that depend on connectionType), values from the previous selection remain in the workflow form data. This causes the Review step to show fields from both the old and new selections, which is confusing and can lead to incorrect submissions.

Problem / Current behavior
• Workflows with conditional parameters (dependencies / oneOf / if-then) behave correctly while filling the form.
• However, on Retry/Re-run, if the user changes the selector (e.g., azureblob → documentdb), the previous branch's fields remain in the underlying formData.
• The Scaffolder Review page reads from the full formData, so it displays "old + new" fields together.

Expected behavior
On Retry/Re-run, when the user changes a dependency-driving field (like connectionType), fields not in the active schema branch should be removed from the form data so that only the current branch appears on Review and is submitted.

Why this matters (Impact)
• Confusing UX during Review
• Higher risk of submitting stale/irrelevant parameters
• Common use case for connection/config workflows (Azure Blob vs. Cosmos DB vs. SQL, etc.)

Requested enhancement
Enable (or provide a supported configuration option to enable) the RJSF behavior that removes inactive fields from formData, such as:
• omitExtraData
• liveOmit
so that hidden/inactive fields do not persist across Retry/Re-run and do not appear on Review.
Internal Developer Portal