How can I run a long, multi-line Hive query in Spark SQL? For example, a query like:
val sqlContext = new HiveContext(sc)
val result = sqlContext.sql("
select ...
from ...
")
Use """ instead, for example:
val results = sqlContext.sql("""
select ....
from ....
""")
Or, if you want to keep the code indented, use:
val results = sqlContext.sql("""
  |select ....
  |from ....
""".stripMargin)
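To see what stripMargin actually does, independent of Spark, here is a minimal runnable sketch (the query text is placeholder):

```scala
// stripMargin removes leading whitespace up to and including the '|'
// on each line, so the source code can stay indented without that
// indentation leaking into the SQL string
val query = """
  |select ....
  |from ....
  """.stripMargin

// The margin characters and leading spaces are gone:
println(query)
```

Lines without a `|` (such as the blank first line) are left untouched, which is why the query string still begins with a newline; Spark SQL ignores that leading whitespace.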
You can use triple quotes at the start and end of the SQL code, or (in PySpark) a backslash at the end of each line.
val results = sqlContext.sql("""
create table enta.scd_fullfilled_entitlement as
select *
from my_table
""")
# PySpark: a backslash at the end of each line continues the string
results = sqlContext.sql(" \
create table enta.scd_fullfilled_entitlement as \
select * \
from my_table \
")
val query = """(SELECT
a.AcctBranchName,
c.CustomerNum,
c.SourceCustomerId,
a.SourceAccountId,
a.AccountNum,
c.FullName,
c.LastName,
c.BirthDate,
a.Balance,
case when [RollOverStatus] = 'Y' then 'Yes' Else 'No' end as RollOverStatus
FROM
v_Account AS a left join v_Customer AS c
ON c.CustomerID = a.CustomerID AND c.Businessdate = a.Businessdate
WHERE
a.Category = 'Deposit' AND
c.Businessdate= '2018-11-28' AND
isnull(a.Classification,'N/A') IN ('Contractual Account','Non-Term Deposit','Term Deposit')
AND IsActive = 'Yes' ) tmp """
val results = sqlContext.sql("select .... " +
  "from .... " +
  "where .... " +
  "group by ....")
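Plain + concatenation works because Scala simply joins the fragments into one single-line string before Spark ever sees it; a standalone sketch (column and table names are hypothetical):

```scala
// Each fragment must end with a trailing space, otherwise the
// joined SQL runs words together ("namefrom users")
val query = "select name " +
  "from users " +
  "where age > 30"

println(query)
```

The trailing-space requirement is the main pitfall of this style; triple-quoted strings avoid it entirely.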
val selectElements = Seq("a", "b", "c")
val builder = new StringBuilder
builder.append("select ")
builder.append(selectElements.mkString(","))
builder.append(" from my_table")
builder.append(" where d<10")
val results = sqlContext.sql(builder.toString())
Write your SQL code inside triple quotes, e.g. """ sql code """.
df = spark.sql(f""" select * from table1 """)
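The f-string in the PySpark line above lets you interpolate variables into the query; the Scala analogue is the s interpolator on a triple-quoted string. A minimal sketch (the table name is hypothetical):

```scala
// The s interpolator substitutes $name / ${expr} inside the string,
// the Scala counterpart of Python's f-string
val tableName = "table1"
val query = s"""select * from $tableName"""

println(query)
```

As with any string interpolation into SQL, only substitute trusted values (such as configuration-driven table names), never raw user input.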