Azure Synapse notebooks vs Azure Databricks notebooks
I was going through the features of Azure Synapse notebooks and Azure Databricks notebooks.
- Are there any major differences between them apart from the platform they belong to?
- Are there any scenarios where one is more appropriate than the other?
See also questions close to this topic
Deploy VueJS + API app to Azure Static Web App with GitLab doesn't create functions
I've started creating a small application that will use VueJS as a frontend with Azure Functions as the backend. I was looking at using Azure Static Web Apps to host both components and GitLab to store and deploy the app.
Everything but the creation of the Azure Functions works. I am following https://docs.microsoft.com/en-us/azure/static-web-apps/gitlab?tabs=vue
The output from the deploy step is listed below:
App Directory Location: '/builds/*/valhalla/valhalla-client/dist/spa' was found.
Api Directory Location: '/builds/*/valhalla/valhalla-api/dist' was found.
Looking for event info
Could not get event info. Proceeding
Starting to build app with Oryx
Azure Static Web Apps utilizes Oryx to build both static applications and Azure Functions. You can find more details on Oryx here: https://github.com/microsoft/Oryx
---Oryx build logs---
Operation performed by Microsoft Oryx, https://github.com/Microsoft/Oryx
You can report issues at https://github.com/Microsoft/Oryx/issues
Oryx Version: 0.2.20220131.3, Commit: ec344c058843461525ff03b46031553b6e15a47a, ReleaseTagName: 20220131.3
Build Operation ID: |qAffRWArEg8=.deee9498_
Repository Commit : 7cdd5b61f956e6cb8459b13a42af363c4440a97b
Detecting platforms...
Could not detect any platform in the source directory.
Error: Could not detect the language from repo.
---End of Oryx build logs---
Oryx was unable to determine the build steps. Continuing assuming the assets in this folder are already built. If this is an unexpected behavior please contact support.
Finished building app with Oryx
Starting to build function app with Oryx
---Oryx build logs---
Operation performed by Microsoft Oryx, https://github.com/Microsoft/Oryx
You can report issues at https://github.com/Microsoft/Oryx/issues
Oryx Version: 0.2.20220131.3, Commit: ec344c058843461525ff03b46031553b6e15a47a, ReleaseTagName: 20220131.3
Build Operation ID: |NGXLP5bVBRk=.705477f6_
Repository Commit : 7cdd5b61f956e6cb8459b13a42af363c4440a97b
Detecting platforms...
Could not detect any platform in the source directory.
Error: Could not detect the language from repo.
---End of Oryx build logs---
Oryx was unable to determine the build steps. Continuing assuming the assets in this folder are already built. If this is an unexpected behavior please contact support.
[WARNING] The function language could not be detected. The language will be defaulted to node.
Function Runtime Information. OS: linux, Functions Runtime: ~3, node version: 12
Finished building function app with Oryx
Zipping Api Artifacts
Done Zipping Api Artifacts
Zipping App Artifacts
Done Zipping App Artifacts
Uploading build artifacts.
Finished Upload.
Polling on deployment.
Status: InProgress. Time: 0.1762737(s)
Status: InProgress. Time: 15.3950401(s)
Status: Succeeded. Time: 30.5043965(s)
Deployment Complete :)
Visit your site at: https://polite-pebble-0dc00000f.1.azurestaticapps.net
Thanks for using Azure Static Web Apps!
Exiting
Cleaning up project directory and file based variables 00:00
Job succeeded
The deploy step appears to have succeeded, and the frontend is deployed, but no Azure Functions show up in this Static Web App. Am I missing something here? So far, the Azure Functions I have are the boilerplate from instantiating a new Azure Functions folder. My .gitlab-ci.yml is below:
image: node:latest

variables:
  API_TOKEN: $DEPLOYMENT_TOKEN
  APP_PATH: '$CI_PROJECT_DIR/valhalla-client/dist/spa'
  API_PATH: '$CI_PROJECT_DIR/valhalla-api/dist'

stages:
  - install_api
  - build_api
  - install_client
  - build_client
  - deploy

install_api:
  stage: install_api
  script:
    - cd valhalla-api
    - npm ci
  artifacts:
    paths:
      - valhalla-api/node_modules/
  cache:
    key: node
    paths:
      - valhalla-api/node_modules/
  only:
    - master

install_client:
  stage: install_client
  script:
    - cd valhalla-client
    - npm ci
  artifacts:
    paths:
      - valhalla-client/node_modules/
  cache:
    key: node
    paths:
      - valhalla-client/node_modules/
  only:
    - master

build_api:
  stage: build_api
  dependencies:
    - install_api
  script:
    - cd valhalla-api
    - npm install -g azure-functions-core-tools@3 --unsafe-perm true
    - npm run build
  artifacts:
    paths:
      - valhalla-api/dist
  cache:
    key: build_api
    paths:
      - valhalla-api/dist
  only:
    - master
  needs:
    - job: install_api
      artifacts: true
      optional: true

build_client:
  stage: build_client
  dependencies:
    - install_client
  script:
    - cd valhalla-client
    - npm i -g @quasar/cli
    - quasar build
  artifacts:
    paths:
      - valhalla-client/dist/spa
  cache:
    key: build_client
    paths:
      - valhalla-client/dist/spa
  only:
    - master
  needs:
    - job: install_client
      artifacts: true
      optional: true

deploy:
  stage: deploy
  dependencies:
    - build_api
    - build_client
  image: registry.gitlab.com/static-web-apps/azure-static-web-apps-deploy
  script:
    - echo "App deployed successfully."
  only:
    - master
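For comparison, this is the kind of layout Oryx and the Functions runtime generally expect at the api path for a Node Functions app: host.json at the root plus one folder per function is what makes the directory recognizable. Folder and function names here are hypothetical:

valhalla-api/dist/
├── host.json
├── package.json
└── HttpTrigger/
    ├── function.json
    └── index.js

The repeated "Could not detect any platform in the source directory" lines in the log suggest that whatever ends up in valhalla-api/dist does not look like this to Oryx, which would also explain why no functions appear after deployment.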
How to authorize Azure Container Registry requests from .NET Core C#
I have a web application that creates container instances, and there are specific container registry images I want to use. As a result, I use this code to get my Azure container registry:
IAzure azure = Azure.Authenticate($"{applicationDirectory}/Resources/my.azureauth")
    .WithDefaultSubscription();
IRegistry azureRegistry = azure.ContainerRegistries
    .GetByResourceGroup("testResourceGroup", "testContainerRegistryName");
I get this error when the second line of code is hit:
The client 'bc8fd78c-2b1b-4596-827e-6a3c918b7c17' with object id 'bc8fd78c-2b1b-4596-827e-6a3c918b7c17' does not have authorization to perform action 'Microsoft.ContainerRegistry/registries/read' over scope '/subscriptions/506b787d-83ef-426a-b7b8-7bfcdd475855/resourceGroups/testapp-live/providers/Microsoft.ContainerRegistry/registries/testapp' or the scope is invalid. If access was recently granted, please refresh your credentials.
I have no idea what to do about this. I have seen so many articles talking about Azure AD and granting user roles. Can someone please walk me step by step through fixing this? I really appreciate the help. Thanks.
I cannot find any client under that object ID, so I am perfectly fine starting from scratch again with a better understanding of what I am doing.
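For context, this error usually means the service principal behind my.azureauth holds no RBAC role on the registry or its resource group. A minimal sketch of granting one with the same fluent SDK; this assumes it runs as a principal that is itself allowed to create role assignments, and it reuses the object id and scope from the error message:

using System;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.Graph.RBAC.Fluent.Models;

// Sketch: grant the failing principal Reader on the registry scope from the error.
IAzure azure = Azure.Authenticate($"{applicationDirectory}/Resources/my.azureauth")
    .WithDefaultSubscription();

azure.AccessManagement.RoleAssignments
    .Define(Guid.NewGuid().ToString())                    // role assignments are keyed by a fresh GUID
    .ForObjectId("bc8fd78c-2b1b-4596-827e-6a3c918b7c17")  // object id from the error message
    .WithBuiltInRole(BuiltInRole.Reader)                  // covers Microsoft.ContainerRegistry/registries/read
    .WithScope("/subscriptions/506b787d-83ef-426a-b7b8-7bfcdd475855" +
               "/resourceGroups/testapp-live/providers/Microsoft.ContainerRegistry/registries/testapp")
    .Create();

The same grant can also be made once from the Azure CLI (az role assignment create --assignee <object-id> --role Reader --scope <registry-resource-id>) rather than in code.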
Load Data Using Azure Batch Service and Spark Databricks
I have a file in Azure Blob Storage that I need to load into the Data Lake daily. I am not clear on which approach I should use (an Azure Batch account with a Custom Activity, Databricks, or a Copy Activity). Please advise.
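If the Databricks route is chosen, the daily load itself can stay very small. A rough sketch with hypothetical account, container, and folder names, assuming the cluster already has credentials for both storage accounts and the file is CSV:

// Hypothetical paths; adjust account, container, and folder names.
val sourcePath = "wasbs://incoming@myblobaccount.blob.core.windows.net/daily/"
val targetPath = "abfss://raw@mydatalake.dfs.core.windows.net/daily/"

// Read the day's file from Blob Storage and land it in the Data Lake as Parquet.
val df = spark.read.format("csv").option("header", "true").load(sourcePath)
df.write.mode("overwrite").format("parquet").save(targetPath)

If no transformation is needed on the way in, a Data Factory Copy Activity does the same move without any cluster to manage.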
How do I INSERT OVERWRITE with Parquet format?
I have two Parquet files in Azure Data Lake Gen2 and I want to INSERT OVERWRITE one with the other. I was trying this in Azure Databricks as shown below.
Reading from the data lake:
val cs_mbr_prov_1 = spark.read.format("parquet").option("header", "true").load(s"${SourcePath}/cs_mbr_prov")
cs_mbr_prov_1.createOrReplaceTempView("cs_mbr_prov")

val cs_mbr_prov_stg = spark.read.format("parquet").option("header", "true").load(s"${SourcePath}/cs_mbr_prov_stg_1")
cs_mbr_prov_stg.createOrReplaceTempView("cs_mbr_prov_stg")
var res = spark.sql(s"INSERT OVERWRITE TABLE cs_br_prov " +
  s"SELECT NAMED_STRUCT('IND_ID',stg.IND_ID,'CUST_NBR',stg.CUST_NBR,'SRC_ID',stg.SRC_ID, " +
  s"'SRC_SYS_CD',stg.SRC_SYS_CD,'OUTBOUND_ID',stg.OUTBOUND_ID,'OPP_ID',stg.OPP_ID, " +
  s"'CAMPAIGN_CD',stg.CAMPAIGN_CD,'TREAT_KEY',stg.TREAT_KEY,'PROV_KEY',stg.PROV_KEY, " +
  s"'INSERTDATE',stg.INSERTDATE,'UPDATEDATE',stg.UPDATEDATE,'CONTACT_KEY',stg.CONTACT_KEY) AS key, " +
  s"stg.MEM_KEY,stg.INDV_ID,stg.MBR_ID,stg.OPP_DT,stg.SEG_ID,stg.MODA,stg.E_KEY, " +
  s"stg.TREAT_RUNDATETIME from cs_br_prov_stg stg")
The error I am getting:
AnalysisException: unknown requires that the data to be inserted have the same number of columns as the target table: target table has 20 column(s) but the inserted data has 9 column(s), including 0 partition column(s) having constant value(s).
But both have the same number of columns.
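One reading of the counts, for what it is worth: the NAMED_STRUCT(...) AS key expression packs twelve staging fields into a single struct column, so the SELECT yields 9 columns (1 struct + 8 scalars), while the target apparently declares 20 flat columns, which matches the "20 vs 9" in the error. A hedged sketch of the flat variant, assuming cs_br_prov really does define those fields as separate (non-struct) columns:

// Sketch: insert 20 flat columns instead of 1 struct + 8 scalars, assuming
// the target table's columns are named like the staging ones.
val res = spark.sql(
  """INSERT OVERWRITE TABLE cs_br_prov
    |SELECT stg.IND_ID, stg.CUST_NBR, stg.SRC_ID, stg.SRC_SYS_CD,
    |       stg.OUTBOUND_ID, stg.OPP_ID, stg.CAMPAIGN_CD, stg.TREAT_KEY,
    |       stg.PROV_KEY, stg.INSERTDATE, stg.UPDATEDATE, stg.CONTACT_KEY,
    |       stg.MEM_KEY, stg.INDV_ID, stg.MBR_ID, stg.OPP_DT,
    |       stg.SEG_ID, stg.MODA, stg.E_KEY, stg.TREAT_RUNDATETIME
    |FROM cs_br_prov_stg stg""".stripMargin)

If the target instead genuinely has a struct-typed key column, the struct's fields would need to match that column's type exactly, since INSERT matches inserted columns by position.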
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 4 in Azure Synapse
I have a Spotify CSV file in my Azure Data Lake. I am trying to create an external table using the serverless SQL pool in Azure Synapse.
I am getting the below error message:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 4 (Track_popularity) in data file https://test.dfs.core.windows.net/data/folder/updated.csv.
I am using the below script:
IF NOT EXISTS (SELECT * FROM sys.external_file_formats WHERE name = 'SynapseDelimitedTextFormat')
    CREATE EXTERNAL FILE FORMAT [SynapseDelimitedTextFormat]
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', USE_TYPE_DEFAULT = FALSE)
    )
GO

IF NOT EXISTS (SELECT * FROM sys.external_data_sources WHERE name = 'test.dfs.core.windows.net')
    CREATE EXTERNAL DATA SOURCE [test.dfs.core.windows.net]
    WITH (LOCATION = 'abfss://data@test.dfs.core.windows.net')
GO

CREATE EXTERNAL TABLE updated (
    [Artist] nvarchar(4000),
    [Track] nvarchar(4000),
    [Track_id] nvarchar(4000),
    [Track_popularity] bigint,
    [Artist_id] nvarchar(4000),
    [Artist_Popularity] bigint,
    [Genres] nvarchar(4000),
    [Followers] bigint,
    [danceability] float,
    [energy] float,
    [key] bigint,
    [loudness] float,
    [mode] bigint,
    [speechiness] float,
    [acousticness] float,
    [instrumentalness] float,
    [liveness] float,
    [valence] float,
    [tempo] float,
    [duration_ms] bigint,
    [time_signature] bigint
)
WITH (
    LOCATION = 'data/updated.csv',
    DATA_SOURCE = [data_test_dfs_core_windows_net],
    FILE_FORMAT = [SynapseDelimitedTextFormat]
)
GO

SELECT TOP 100 * FROM dbo.updated
GO
Below is the data sample
My CSV is UTF-8 encoded, so I am not sure what the issue is. The error points at column 4 (Track_popularity). Please advise.
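One likely cause, offered as a guess: if updated.csv starts with a header row, the load will try to cast the literal header text "Track_popularity" to bigint, which fails at exactly row 1, column 4, as reported. FIRST_ROW is a documented DELIMITEDTEXT option for skipping the header; the format name below is hypothetical:

-- Sketch: a file format that treats row 1 as a header and starts reading at row 2.
IF NOT EXISTS (SELECT * FROM sys.external_file_formats WHERE name = 'SynapseDelimitedTextFormatHeader')
    CREATE EXTERNAL FILE FORMAT [SynapseDelimitedTextFormatHeader]
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (
            FIELD_TERMINATOR = ',',
            FIRST_ROW = 2,           -- skip the header row
            USE_TYPE_DEFAULT = FALSE
        )
    )
GO

The external table would then reference this format in its FILE_FORMAT clause.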
How to get the KeyVault name in the notebook from the keyvault link in Synapse?
I need the KeyVault name to pass to the TokenLibrary.
TokenLibrary.getSecret(keyVaultName,"MyConnectionString", "AzureKeyVaultLink")
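I am not aware of a direct lookup from the linked service to the vault name, but if the end goal is only the secret, the linked-service overload in mssparkutils may avoid needing the name at all. A sketch to verify against your runtime, assuming the Key Vault linked service is named AzureKeyVaultLink:

from notebookutils import mssparkutils

# Resolve the secret through the Key Vault linked service itself,
# so the vault name never has to appear in the notebook.
conn_str = mssparkutils.credentials.getSecretWithLS("AzureKeyVaultLink", "MyConnectionString")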