airflow mssql_to_gcs source code

  • 2022-10-20

airflow mssql_to_gcs code

File path: /airflow/providers/google/cloud/transfers/mssql_to_gcs.py

#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
"""MsSQL to GCS operator."""
from __future__ import annotations

import datetime
import decimal

from airflow.providers.google.cloud.transfers.sql_to_gcs import BaseSQLToGCSOperator
from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook


class MSSQLToGCSOperator(BaseSQLToGCSOperator):
    """Copy data from Microsoft SQL Server to Google Cloud Storage
    in JSON, CSV or Parquet format.

    :param mssql_conn_id: Reference to a specific MSSQL hook.

    **Example**:
        The following operator will export data from the Customers table
        within the given MSSQL Database and then upload it to the
        'mssql-export' GCS bucket (along with a schema file). ::

            export_customers = MSSQLToGCSOperator(
                task_id='export_customers',
                sql='SELECT * FROM dbo.Customers;',
                bucket='mssql-export',
                filename='data/customers/export.json',
                schema_filename='schemas/export.json',
                mssql_conn_id='mssql_default',
                gcp_conn_id='google_cloud_default',
                dag=dag
            )

    .. seealso::
        For more information on how to use this operator, take a look at the guide:
        :ref:`howto/operator:MSSQLToGCSOperator`

    """

    ui_color = '#e0a98c'

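    # Maps DB-API type codes from the MSSQL cursor description to BigQuery column
    # types; any code not listed here falls back to STRING in field_to_bigquery below.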
    type_map = {3: 'INTEGER', 4: 'TIMESTAMP', 5: 'NUMERIC'}

    def __init__(self, *, mssql_conn_id='mssql_default', **kwargs):
        super().__init__(**kwargs)
        self.mssql_conn_id = mssql_conn_id

    def query(self):
        """
        Queries MSSQL and returns a cursor of results.

        :return: mssql cursor
        """
        mssql = MsSqlHook(mssql_conn_id=self.mssql_conn_id)
        conn = mssql.get_conn()
        cursor = conn.cursor()
        cursor.execute(self.sql)
        return cursor

    def field_to_bigquery(self, field) -> dict[str, str]:
        return {
            'name': field[0].replace(" ", "_"),
            'type': self.type_map.get(field[1], "STRING"),
            'mode': "NULLABLE",
        }

    @classmethod
    def convert_type(cls, value, schema_type, **kwargs):
        """
        Takes a value from MSSQL, and converts it to a value that's safe for
        JSON/Google Cloud Storage/BigQuery.
        Datetime, Date and Time are converted to ISO formatted strings.
        Decimals are converted to floats.
        """
        if isinstance(value, decimal.Decimal):
            return float(value)
        if isinstance(value, (datetime.date, datetime.time)):
            return value.isoformat()
        return value
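
For context, here is a minimal DAG sketch showing how this operator could be wired up. The DAG id, schedule, source table, bucket, and object paths are illustrative assumptions; the connection IDs are the defaults referenced in the operator docstring above.

import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator

with DAG(
    dag_id="mssql_to_gcs_example",                 # hypothetical DAG id
    start_date=datetime.datetime(2022, 10, 1),
    schedule_interval=None,                        # run on manual trigger only
    catchup=False,
) as dag:
    export_customers = MSSQLToGCSOperator(
        task_id="export_customers",
        sql="SELECT * FROM dbo.Customers;",        # assumed source table
        bucket="mssql-export",                     # assumed GCS bucket
        filename="data/customers/export_{}.json",  # {} is filled with a file chunk number
        schema_filename="schemas/customers.json",
        export_format="json",                      # 'csv' and 'parquet' are also supported
        mssql_conn_id="mssql_default",
        gcp_conn_id="google_cloud_default",
    )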

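A quick sketch (my own illustration, not part of the Airflow source) of what convert_type does to a few representative MSSQL values. Since it is a classmethod, it can be called without instantiating the operator; the schema_type argument is ignored by this implementation.

import datetime
import decimal

from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator

# Decimal -> float, date/time/datetime -> ISO strings, everything else passes through.
assert MSSQLToGCSOperator.convert_type(decimal.Decimal("9.99"), "FLOAT") == 9.99
assert MSSQLToGCSOperator.convert_type(datetime.date(2022, 10, 20), "TIMESTAMP") == "2022-10-20"
assert (
    MSSQLToGCSOperator.convert_type(datetime.datetime(2022, 10, 20, 8, 30), "TIMESTAMP")
    == "2022-10-20T08:30:00"
)
assert MSSQLToGCSOperator.convert_type("unchanged", "STRING") == "unchanged"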