30 May 2022 |
| 레몬버터구이 changed their display name from _slack_kubeflow_U03EE7VFCDN to 레몬버터구이. | 02:19:24 |
| 레몬버터구이 set a profile picture. | 02:19:26 |
Frédéric Kaczynski | The only thing I think could be used for your use-case is Workflow Events (https://argoproj.github.io/argo-workflows/workflow-events/), but it says it should not be used for automation. :/ | 08:43:26 |
Cornelis Boon | Thanks! Will have a look, but I guess I’ll just write code that catches any errors and reports them as such before raising/throwing them further | 08:47:34 |
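A minimal sketch (not from the thread, stdlib only) of the approach Cornelis describes: wrap each step so any error is reported before being re-raised, so the step still fails visibly. The names `report_errors` and `flaky_step` are illustrative.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)

def report_errors(fn):
    """Wrap a component body so failures are logged before propagating."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            # Swap this for whatever reporting channel you use
            # (Slack webhook, metrics, etc.) before re-raising.
            logging.error("step %s failed: %s", fn.__name__, exc)
            raise
    return wrapper

@report_errors
def flaky_step(x: int) -> int:
    if x < 0:
        raise ValueError("negative input")
    return x * 2
```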
Chase Christensen | # define a function to add two numbers
import kfp.components as comp
import kfp.dsl as dsl

def add(a: float, b: float) -> float:
    return a + b

add_op = comp.func_to_container_op(add)  # a factory function that creates kfp.dsl.ContainerOp instances for your pipeline

# function to write a float to a path
def write(path: str, x: float) -> str:
    num = str(x)
    f = open(path, "a")
    f.write(num)
    f.close()
    return path

write("demofile2.txt", 1)  # testing our function

def read(path: str):
    f = open(path, "r")
    print(f.read())
    f.close()

write_op = comp.func_to_container_op(write)  # converting our function to a ContainerOp factory
read_op = comp.func_to_container_op(read)  # converting our function to a ContainerOp factory

# new pipeline with a volume being passed
@dsl.pipeline(
    name='Volume Pipeline',
    description='simple pipeline to create a FRESH volume, add some numbers, and attach that volume to our different pods for their runs'
)
def volume_pipeline(
    a='1',
    b='2',
    c='3',
):
    vop = dsl.VolumeOp(
        name="volume_creation",
        resource_name="mypvc",
        size="5Gi",
        modes=dsl.VOLUME_MODE_RWM
    )
    add_task1 = add_op(a, 3).add_pvolumes({"/mnt": vop.volume})
    add_task2 = add_op(add_task1.output, b)
    add_task3 = add_op(add_task2.output, c)
    write_task = write_op("/mnt/output.txt", add_task3.output).add_pvolumes({"/mnt": vop.volume})
    read_task = read_op(write_task.output).add_pvolumes({"/mnt": vop.volume}) | 11:50:09 |
laserK3000 | Thanks for your help! | 13:00:57 |
Shrinath Suresh | The PR - https://github.com/kubeflow/pipelines/pull/7615 is still blocked due to an upstream build failure. Can someone help ? | 17:35:32 |
Yingding Wang | Is there an API to define init container in KFP python SDK? I would like to use init container to mount a minio bucket with fuse (https://github.com/minio/minfs) for the kf components. | 18:51:05 |
Yingding Wang | I found the API to define an init container (https://kubeflow-pipelines.readthedocs.io/en/stable/source/kfp.dsl.html?highlight=pvolumes), but I think it will not work as I thought. | 20:07:04 |
31 May 2022 |
Steven Tobias | Redacted or Malformed Event | 02:09:01 |
Nicholas Kosteski | Has anyone had trouble getting their visualizations to render in the Run Outputs tab with kubeflow pipelines 1.8.1? If it matters, I'm using the standalone deployment and my pipeline code is using the v1 compiler. I found an open issue for v2 but no mentions of v1 breaking. Any insight/experience from anyone would be super helpful! | 14:51:01 |
Joseph Olaide | Hi Nicholas, what are your issues? | 14:52:39 |
Joseph Olaide | Are you passing the visualization metadata to the right path? Nicholas Kosteski | 14:53:19 |
Nicholas Kosteski | I've attached some screenshots: the markdown shows up accurately in the pod's visualizations tab, but when I go to the Run Output tab nothing shows up. The data is inline, and the artifacts seem to be generated fine. | 14:56:13 |
Nicholas Kosteski | Here's the test code:
from typing import NamedTuple

import kfp

@kfp.components.create_component_from_func
def metadata_and_metrics() -> NamedTuple(
    "Outputs",
    [("mlpipeline_ui_metadata", "UI_metadata"), ("mlpipeline_metrics", "Metrics")],
):
    metadata = {
        "outputs": [
            {"storage": "inline", "source": "this should be bold", "type": "markdown"}
        ]
    }
    metrics = {
        "metrics": [
            {
                "name": "train-accuracy",
                "numberValue": 0.9,
            },
            {
                "name": "test-accuracy",
                "numberValue": 0.7,
            },
        ]
    }
    from collections import namedtuple
    import json

    return namedtuple("output", ["mlpipeline_ui_metadata", "mlpipeline_metrics"])(
        json.dumps(metadata), json.dumps(metrics)
    )

@kfp.dsl.pipeline()
def pipeline():
    metadata_and_metrics()
 | 14:57:02 |
Joseph Olaide | If I understand you, it's not displaying in the pod visualization?
However, the pipeline's run output does show the metrics. | 15:01:09 |
Nicholas Kosteski | Yeah, in the version of kfp we run in prod currently those pod visualizations are seen in the Run Output tab. However it seems something might have broken between v1.0.4 and v1.8.1 (I know big jump haha) | 15:03:09 |
Irvin Tang | hey sorry for the late reply. forgot how this was resolved, but i’m not seeing this error anymore | 15:11:03 |
Joseph Olaide | I don't know if it's a version issue.
My pipeline's scalar metrics appear in my run output,
while my pipeline visualizations appear in the pipeline step or pod they were created in.
This is also how it appears in the KFP SDK v1 pipeline output viewer guide. | 15:22:53 |
Nicholas Kosteski | Are you referring to this guide? | 15:24:46 |
Joseph Olaide | Yes | 15:26:41 |
Irvin Tang | hi all. currently, when querying for pipelines using the KFP client, the pipeline object that gets returned has an attribute called parameters which is a list of kfp_server_api.models.api_parameter.ApiParameter objects. These objects only specify the parameter name and value. Is there a way to retrieve the expected type for the pipeline parameter as well? | 15:27:31 |
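As far as I know, `ApiParameter` only carries `name` and `value`. One possible workaround (an assumption, not a documented API) is to fetch the pipeline's compiled workflow template and read the `pipelines.kubeflow.org/pipeline_spec` annotation that the v1 compiler writes, which can include a `type` field per input. The helper name `parameter_types` and the sample annotation below are illustrative; in practice the template dict would come from something like the client's `get_template` call for the pipeline ID.

```python
import json

def parameter_types(template: dict) -> dict:
    """Map parameter name -> declared type (None when no type was annotated)."""
    spec_json = template["metadata"]["annotations"][
        "pipelines.kubeflow.org/pipeline_spec"
    ]
    spec = json.loads(spec_json)
    return {p["name"]: p.get("type") for p in spec.get("inputs", [])}

# A trimmed, hypothetical example of what the annotation can look like:
sample_template = {
    "metadata": {
        "annotations": {
            "pipelines.kubeflow.org/pipeline_spec": json.dumps({
                "name": "Volume Pipeline",
                "inputs": [
                    {"name": "a", "default": "1", "type": "String"},
                    {"name": "b", "default": "2"},
                ],
            })
        }
    }
}

print(parameter_types(sample_template))  # → {'a': 'String', 'b': None}
```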
Nicholas Kosteski | It does mention the visualizations should be available on the Run Outputs though:
The Run output tab shows the visualizations for all pipeline steps in the selected run. To open the tab in the Kubeflow Pipelines UI: | 15:28:41 |
Nicholas Kosteski | I just realized there was a link to the code that is meant to load up the visualizations in that issue. I'll try taking a look to see if something happened to where its not being labeled correctly in the instance I have up | 15:31:19 |
1 Jun 2022 |
| _slack_kubeflow_U03HM661ZPY joined the room. | 11:33:55 |
droctothorpe | Is there any particular reason you can’t delete, as opposed to archive, experiments in the web GUI? | 14:38:07 |
Nicholas Kosteski | Looks like I finally found something that's different between the two versions I have: the actual argo_artifact is being created differently. In the version that works, the artifact is:
{
"name": "mlpipeline-ui-metadata",
"optional": true,
"path": "/tmp/outputs/mlpipeline_ui_metadata/data",
"s3": {
"accessKeySecret": {
"key": "accesskey",
"name": "mlpipeline-minio-artifact"
},
"bucket": "mlpipeline",
"endpoint": "minio-service.kubeflow:9000",
"insecure": true,
"key": "artifacts/allocation-tx-5gpvh/allocation-tx-5gpvh-170870440/mlpipeline-ui-metadata.tgz",
"secretKeySecret": {
"key": "secretkey",
"name": "mlpipeline-minio-artifact"
}
}
}
In the one that doesn't work, it's created as:
{
"name": "mlpipeline-ui-metadata",
"optional": true,
"path": "/tmp/outputs/mlpipeline_ui_metadata/data",
"s3": {
"key": "artifacts/markdown-pipeline-ldvxr/2022/06/01/markdown-pipeline-ldvxr-4134324342/mlpipeline-ui-metadata.tgz"
}
}
| 15:58:17 |
Nicholas Kosteski | It seems like maybe my workflow-controller-configmap might have some kind of issue | 16:00:20 |
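The shape of the broken artifact is consistent with the workflow controller relying on a default S3 artifact repository (so only the `key` appears on the artifact) while the working one has the full repository details inlined. A sketch of the relevant section of `workflow-controller-configmap`, with the endpoint, bucket, and secret names assumed from the working artifact above rather than from any particular deployment:

```yaml
# Sketch: artifactRepository in the Argo workflow-controller-configmap,
# using values that appear in the working artifact in this thread.
apiVersion: v1
kind: ConfigMap
metadata:
  name: workflow-controller-configmap
  namespace: kubeflow
data:
  artifactRepository: |
    archiveLogs: true
    s3:
      endpoint: "minio-service.kubeflow:9000"
      bucket: "mlpipeline"
      insecure: true
      accessKeySecret:
        name: mlpipeline-minio-artifact
        key: accesskey
      secretKeySecret:
        name: mlpipeline-minio-artifact
        key: secretkey
```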
Joseph Olaide | from typing import NamedTuple

import kfp
from kfp.components import create_component_from_func

@create_component_from_func
def produce_markdown() -> NamedTuple('Outputs', [('MLPipeline_UI_metadata', 'UI_metadata')]):
    import sys, json, subprocess
    subprocess.run([sys.executable, '-m', 'pip', 'install', 'pandas'])
    import pandas as pd

    matrix = [
        ['y', 'y', 10],
        ['y', 'n', 9],
        ['n', 'y', 6],
        ['n', 'y', 7]
    ]
    df = pd.DataFrame(matrix, columns=['target', 'predicted', 'count'])
    metadata = {
        "outputs": [
            {
                "type": "confusion_matrix",
                "format": "csv",
                "schema": [
                    {"name": "target", "type": "CATEGORY"},
                    {"name": "predicted", "type": "CATEGORY"},
                    {"name": "count", "type": "NUMBER"}
                ],
                "source": df.to_csv(header=False, index=False),
                "storage": "inline",
                "labels": ["yummy", "not yummy"]
            }
        ]
    }
    return [json.dumps(metadata)]

def my_pipeline():
    produce_markdown()

kfp.Client().create_run_from_pipeline_func(my_pipeline, arguments={}) | 16:30:39 |
Joseph Olaide | Hi Nicholas Kosteski, the sample code above creates a run output artifact and a visualization in the pod.
I hope this is helpful. | 16:33:10 |