I am trying to copy a file to Azure Databricks DBFS through an Azure DevOps pipeline. Below is the snippet of the yml file I am using:
stages:
- stage: MYBuild
  displayName: "My Build"
  jobs:
  - job: BuildwhlAndRunPytest
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
      - task: UsePythonVersion@0
        displayName: 'Use Python 3.7'
        inputs:
          versionSpec: '3.7'
          addToPath: true
          architecture: 'x64'
      - script: |
          pip install pytest requests setuptools wheel pytest-cov
          pip install -U databricks-connect==7.3.*
        displayName: 'Load Python Dependencies'
      - checkout: self
        persistCredentials: true
        clean: true
      - script: |
          echo "y
          $(databricks-host)
          $(databricks-token)
          $(databricks-cluster)
          $(databricks-org-id)
          8787" | databricks-connect configure
          databricks-connect test
        env:
          databricks-token: $(databricks-token)
        displayName: 'Configure DBConnect'
      - script: |
          databricks fs cp test-proj/pyspark-lib/configs/config.ini dbfs:/configs/test-proj/config.ini
At the stage where the databricks fs cp command is invoked, I run into the following error:
/home/vsts/work/_temp/2278f7d5-1d96-4c4e-a501-77c07419773b.sh: line 7: databricks: command not found
However, when I run databricks-connect test, that command completes successfully. Please help me figure out whether I am missing a step somewhere.
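For reference, here is a minimal diagnostic step I could add to the pipeline to see which Databricks executables and packages are actually present on the agent. This step and its displayName are hypothetical (not part of my current yml), and it assumes the issue comes down to what pip put on the agent's PATH:

      - script: |
          # Hypothetical diagnostic step: report which Databricks executables
          # and packages are actually available on this build agent.
          which databricks || echo "databricks CLI not found on PATH"
          which databricks-connect || echo "databricks-connect not found on PATH"
          pip show databricks-cli || echo "databricks-cli package not installed"
        displayName: 'Diagnose Databricks executables'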