
    Vote 8

    Kevin Billings on 11/17/2023 6:33:51 PM

    Synapse

    Allow MSFabric Data Engineering Notebooks to Connect to ADLS Gen2 via private endpoint or make MSFabric a trusted Microsoft resource on the ADLS Gen2 Storage Account

    Allow notebooks to read/write to ADLS Gen2 storage accounts that are secured by a private endpoint, and allow Lakehouse to create shortcuts to ADLS Gen2 storage accounts secured by a private endpoint. In both scenarios the storage account has "deny public network access" set to true.
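    For illustration, a minimal sketch of the notebook access this idea is about, assuming placeholder account, container, and path names and using the built-in spark session of a Fabric notebook:

```python
# Placeholder account/container/path names; "contosolake" denies public
# network access, which is exactly the case this idea asks to support.
source_path = "abfss://raw@contosolake.dfs.core.windows.net/sales/2023/"
target_path = "abfss://curated@contosolake.dfs.core.windows.net/sales/2023/"

# Read from the secured ADLS Gen2 account (blocked today without a private
# endpoint or Fabric being treated as a trusted Microsoft resource).
df = spark.read.parquet(source_path)

# Write the results back to the same secured account.
df.write.mode("overwrite").parquet(target_path)
```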

    Planned

    Administrator

    Reviewing internally.

    Vote 10

    Khadar Basha on 6/17/2021 9:47:39 PM

    Power BI

    Configuring dataflow storage to use Azure Data Lake Gen 2. Currently not supporting ADLS Gen2 Storage Accounts behind a firewall.

    Configuring dataflow storage to use Azure Data Lake Gen 2 currently does not support ADLS Gen2 storage accounts behind a firewall. When will this be supported? Please help here.

    Needs Votes
    Vote 4

    ryoma nagata on 4/14/2020 8:57:05 AM

    Power BI

    Connecting to ADLS Gen2 of an external AAD

    Currently, when connecting to ADLS Gen2 of an external AAD, the storage account is looked up in the AAD tenant that the connecting user belongs to, so it is not possible to connect to ADLS Gen2 in an external tenant. It would be nice to be able to connect to ADLS Gen2 of an external AAD.

    Needs Votes
    Vote 3

    Avadh Kishore on 7/21/2023 4:38:21 PM

    Core

    Connectivity to Azure storage accounts (ADLS Gen2/Blob) using SPN in Fabric

    Connectivity to Azure storage accounts (ADLS Gen2/Blob) using SPN in Fabric.

    Currently this is supported in Shortcuts (via Connections); however, there is no way to access existing data paths the way they are used in Databricks or Synapse.

    This could lead to a high amount of effort when migrating current systems to Microsoft Fabric capacity.

    Example of a current data path: 'abfss://{containername}@{storageaccount}.dfs.core.windows.net/sample.parquet'
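    For context, a minimal sketch of the service principal (SPN) pattern used today in Databricks/Synapse Spark that this idea asks Fabric to support against the same data paths; the account, tenant, client ID, and secret values are placeholders:

```python
# Placeholder values; in practice the secret would come from a key vault.
account = "storageaccount"
tenant_id = "<tenant-id>"

# Standard Hadoop ABFS OAuth settings for service-principal authentication.
spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(f"fs.azure.account.oauth2.client.id.{account}.dfs.core.windows.net", "<client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{account}.dfs.core.windows.net", "<client-secret>")
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{account}.dfs.core.windows.net",
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
)

# The existing data path, unchanged from the Databricks/Synapse workloads.
df = spark.read.parquet(f"abfss://containername@{account}.dfs.core.windows.net/sample.parquet")
```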

    New
    Vote 3

    Miles Cole on 4/5/2019 10:37:56 PM

    Power BI

    Allow Shared Capacities (Pro) to store data in ADLS Gen2 directly from on-premises sources

    "On-premises data sources, in Power BI Shared capacities, are not supported in dataflows stored in your organization’s Azure Data Lake Storage Gen2." We would like to be able to subscribe to ADLS Gen2 and use dataflows to store data there directly from our on-premises sources. We currently can't do this, which is a big problem because it means we cannot bring more than 10 GB of on-premises data into shared capacity dataflows. From there we can do additional ETL and store that in ADLS Gen2, but we are limited in the amount of data we can stage into shared capacity dataflows. You shouldn't need Premium just to put on-premises data directly into ADLS when you are paying for the storage separately.

    Needs Votes
    Vote 1

    Raymond Law on 8/2/2023 3:32:51 PM

    Data Factory

    OneLake URL Validation error in Copy Activity ADLS Gen2 connector

    I am writing to report a URL validation issue, as the Microsoft Fabric Blog (link and quote shown at the end) suggested. Below is the URL validation error message from the Copy Activity (ADLS Gen2 connector) in Data Factory in Fabric.





    The data store endpoint is not supported in 'AzureBlobFS' connector. Error message : 'The domain of this endpoint is not in allow list. Original endpoint: 'onelake.dfs.fabric.microsoft.com'' The domain of this endpoint is not in allow list. Original endpoint: 'onelake.dfs.fabric.microsoft.com'



    Also, it would be appreciated if the ADLS Gen2 connector could support reading and writing the Delta Parquet file format in Copy Activity.

    These are some of the key missing pieces needed to make OneLake truly supported by the ADLS Gen2 connector in Data Pipeline. It would help implement an end-to-end data pipeline that works across workspaces. For example, I need an end-to-end data pipeline to extract data from Azure SQL Managed Instance into different workspaces. It consists of a Bronze Lakehouse workspace for raw files, a Silver Lakehouse workspace for conformed files, and a Gold Lakehouse workspace for curated files. This end-to-end pipeline is located in the Gold Lakehouse workspace. It should copy data from the source Azure SQL Managed Instance to the Bronze Lakehouse (so it needs an ADLS Gen2 connector that can write across workspaces), then transform data from Bronze to Silver with a Notebook activity, followed by a Dataflow Gen2 activity for implementing business transformation logic. The only missing piece is the ADLS Gen2 connector supporting OneLake (URL and Delta Parquet format).


    Reference:

    https://blog.fabric.microsoft.com/en-ca/blog/connecting-to-onelake

    "To help ensure calls are only made to authorized domains, some tools validate storage URLs to ensure they match known endpoints. As OneLake has a distinct endpoint from ADLS Gen2, these tools will block calls to OneLake or not know to use ADLS Gen2 protocols with OneLake. One way around this is to use custom endpoints (as in the Powershell example above). Otherwise, it’s often a simple fix to add OneLake’s endpoint (fabric.microsoft.com). If you find a URL validation issue or any other problems connecting to OneLake, please let us know at aka.ms/fabricideas!"

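    For reference, a minimal sketch of reaching OneLake through its ADLS Gen2-compatible DFS endpoint with the Azure Storage SDK for Python, following the custom-endpoint approach described in the blog; the workspace and lakehouse names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake exposes the ADLS Gen2 DFS API surface at this endpoint.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# The workspace plays the role of the container (file system) in OneLake.
fs = service.get_file_system_client("GoldWorkspace")
file_client = fs.get_file_client("Curated.Lakehouse/Files/sample.parquet")
data = file_client.download_file().readall()
```

    A call like this works because the SDK accepts the custom endpoint; the Copy Activity's ADLS Gen2 connector rejects the same URL during validation, which is the gap described above.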
    Needs Votes

    Administrator

    Could you provide details on what validation error you hit? 

    Vote 1

    Rob Martin on 8/17/2020 8:48:37 AM

    Power BI

    When using ADLS source data, support ACLs for external user (guest) accounts at the folder level

    Consider the scenario where we have a single ADLS Gen2 storage warehouse for a multi-tenant system that also includes general data. The ADLS holds data collated from the system, plus tenant-specific data sources written to a per-tenant folder. Permissions are applied at the folder level, restricting access to the designated tenant; however, for Power BI to be able to ingest this data, permissions need to be assigned at the storage level, making the calling user an owner of the storage. This effectively undermines the folder-level permissions. Please consider enabling Access Control List support for ADLS Gen2 folders to facilitate finer-grained security controls on the data.
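    For illustration, a minimal sketch of the kind of folder-level ACL grant this scenario relies on, using the Azure Storage SDK for Python; the account, file system, folder, and guest object ID are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://contosolake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("warehouse")
tenant_dir = fs.get_directory_client("tenants/tenant-a")

# Grant the guest (external) user's object ID read + execute on the tenant
# folder and its contents only; no account-level role assignment involved.
guest_oid = "00000000-0000-0000-0000-000000000000"
tenant_dir.update_access_control_recursive(
    acl=f"user:{guest_oid}:r-x,default:user:{guest_oid}:r-x"
)
```

    The ask is for Power BI to honour ACLs like these when the calling identity is a guest account, instead of requiring storage-level ownership.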

    Needs Votes
    Vote 8

    Thomas Pagel on 10/31/2019 7:28:25 PM

    Power BI

    Add "Azure Data Lake Storage Gen2" as a Data Source in DataFlows

    Couldn't believe that this is true: there's no connector in DataFlows to read from files stored in Azure Data Lake Storage Gen2. You can connect to other DataFlows, but if you have a plain and simple file stored on the Data Lake (so no CDM format) you're stuck... Maybe you can grab it via the Blob API, which is now available, but that doesn't give you the single sign-on experience and fine-grained security you can define in ADLS Gen2.

    Completed

    Administrator

    ADLS gen2 is supported in dataflows

    Vote 1

    Shivani NA on 12/22/2021 1:53:29 PM

    Power BI

    Feature update in Tenant Level Storage using ADLS Gen 2 Account in Power BI

    We have created an Azure Data Lake Storage Gen2 account for our organization to store dataflows. We selected tenant-level storage with workspace-level storage disabled, so workspace admins can optionally configure their dataflows to use this connection. If we are using tenant-level storage, it should automatically use the ADLS Gen2 account without manual configuration. Currently we need to configure storage manually for every workspace, even after selecting tenant-level storage. We see that this is a preview feature, and it would be difficult for users to configure it manually if we have hundreds of workspaces. We look forward to this being implemented in the standard release of Azure Connections.

    New
    Vote 17

    Power BI User on 9/19/2019 8:44:13 AM

    Power BI

    Ease of Use for ADLS Gen 2 Connector

    I've given my client a bit of time to work with ADLS Gen2 and Power BI, and the feedback I'm getting from several of the users involved all comes down to the same issue. It concerns the view the user gets of the data lake store when they are working through the 'Get Data' routine using the new connector for ADLS Gen2. They all want to see a hierarchical folder structure presented to them, as it is in Explorer on their own machines or in Azure Storage Explorer. The data lake they are working with has a fairly complex folder and access security configuration, so a flat view where the paths are shown as text is not helpful, particularly as the paths are mostly too long to be seen clearly in the dialog. They don't want to see the query editor unless they want to edit the query, and in most cases they have no need to. These are business people who have the ability to manipulate and use information and want access to it in a way they are familiar with.

    Needs Votes