How to join two SQL tables by extracting maximum numbers from one then into another?
As others commented, I am now adding some code:
My tables:

table3
Case No. is the primary key. Each report date shows one patient. Depending on whether the patient is import or local, the corresponding cumulative column increases. You can see that some days have no cases, so a date like 25/01/2020 is skipped.

table2
Report date has no duplicates.
Now, I want to join the tables. In the result, the maximum cumulative value for each date is joined into the new table: although 26/01/2020 in table3 shows the count increasing through 6 and 7 to 8, I only want the highest cumulative number there.
Thanks for letting me know how to improve my previous question; your comments helped me a lot.

I have tried Gordon Linoff's answer, substituting in the real names (I originally omitted them because I thought they were unambiguous). His code is as follows (I upvoted it):
SELECT t3.`Report date`,
       max(max(t3.cumulative_local)) over (order by t3.`Report date`),
       max(max(t3.cumulative_import)) over (order by t3.`Report date`)
FROM table3 t3 LEFT JOIN
     table2 t2
     USING (`Report date`)
GROUP BY t2.`Report date`;
But I get an error:
Error Code: 1055. Expression #1 of SELECT list is not in GROUP BY clause and contains nonaggregated column 'new.t3.Report date' which is not functionally dependent on columns in GROUP BY clause; this is incompatible with sql_mode=only_full_group_by
Anyway, I am experimenting now. Both answers are helpful. Please let me know if you know how to fix error 1055, or if you can suggest another solution. Thanks.
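Error 1055 should go away once the GROUP BY names the same column that appears in the SELECT list, i.e. `group by t3.`Report date`` instead of `t2.`Report date``. The same running-maximum logic can also be written with an explicit per-date subquery, which sidesteps `only_full_group_by` entirely. A minimal sketch against SQLite, with made-up table contents for illustration:

```python
import sqlite3

# Tiny stand-in for table3: several rows per report date, cumulative counts.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE table3 (report_date TEXT, cumulative_local INT, cumulative_import INT)"
)
con.executemany("INSERT INTO table3 VALUES (?, ?, ?)", [
    ("2020-01-24", 3, 2),
    ("2020-01-26", 6, 3),
    ("2020-01-26", 7, 3),
    ("2020-01-26", 8, 4),   # highest cumulative numbers for 26/01
])

# Per-date maximum first (GROUP BY in a subquery), then a running maximum
# via a window function -- the same result as the nested max(max()) over (...),
# but grouping only on the column that is actually selected, which is what
# sql_mode=only_full_group_by (error 1055) demands.
rows = con.execute("""
    SELECT report_date,
           MAX(day_local)  OVER (ORDER BY report_date) AS running_local,
           MAX(day_import) OVER (ORDER BY report_date) AS running_import
    FROM (
        SELECT report_date,
               MAX(cumulative_local)  AS day_local,
               MAX(cumulative_import) AS day_import
        FROM table3
        GROUP BY report_date
    )
""").fetchall()
print(rows)  # [('2020-01-24', 3, 2), ('2020-01-26', 8, 4)]
```

The inner query keeps only the highest cumulative number per report date; the outer window MAX then carries the largest value seen so far forward across dates.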
I don't understand why table1 has cumulA and cumulB. I suppose it is meant to store the max cumulA and cumulB for each day.
You first have to self-join table2 to find the maximum for each date (using GROUP BY on the date):
SELECT t2.id, t2.date, cA
FROM t2
JOIN (
    SELECT id, MAX(cumulA) AS cA, date AS d2
    FROM t2
    GROUP BY d2
) AS td
  ON t2.id = td.id
 AND t2.date = d2
ORDER BY t2.date
After that, you LEFT JOIN table1 to the result of the self-joined table2, so you get every day:
SELECT * FROM `t1` LEFT JOIN t2 ON t1.date = t2.date ORDER BY t1.date
Here is the combination of the two queries:
SELECT * FROM `t1` LEFT JOIN (
    SELECT t2.id, t2.date, cA
    FROM t2
    JOIN (
        SELECT id, MAX(cumulA) AS cA, date AS d2
        FROM t2
        GROUP BY d2
    ) AS td
      ON t2.id = td.id
     AND t2.date = d2
    ORDER BY t2.date
) AS tt
ON t1.date = tt.date ORDER BY t1.date
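The combined query above can be exercised end to end with SQLite. The sketch below uses made-up data and simplifies one thing: the join back on `id` is only needed if you want other columns from the row that holds the maximum, so here the per-date GROUP BY is LEFT JOINed onto t1 directly:

```python
import sqlite3

# Stand-ins for t1 (one row per date) and t2 (several rows per date,
# cumulA carrying a growing count). Names and data are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t1 (date TEXT)")
con.execute("CREATE TABLE t2 (id INT, date TEXT, cumulA INT)")
con.executemany("INSERT INTO t1 VALUES (?)",
                [("2020-01-24",), ("2020-01-25",), ("2020-01-26",)])
con.executemany("INSERT INTO t2 VALUES (?, ?, ?)", [
    (1, "2020-01-24", 3),
    (2, "2020-01-26", 6),
    (3, "2020-01-26", 7),
    (4, "2020-01-26", 8),
])

# Per-date max from t2, then LEFT JOIN onto t1 so dates with no cases
# (25/01 here) still appear, with NULL for the missing maximum.
rows = con.execute("""
    SELECT t1.date, tt.cA
    FROM t1
    LEFT JOIN (
        SELECT date, MAX(cumulA) AS cA
        FROM t2
        GROUP BY date
    ) AS tt ON t1.date = tt.date
    ORDER BY t1.date
""").fetchall()
print(rows)  # [('2020-01-24', 3), ('2020-01-25', None), ('2020-01-26', 8)]
```

The LEFT JOIN is what keeps 25/01/2020 in the output even though table2 has no row for it.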
You do the same thing for cumulB. After that (I suppose), you insert the result into table1.

I hope this answers your question.

All the best,

_Teddy_
I think you just want aggregation and window functions:
select t1.date,
       max(max(cumulativea)) over (order by t1.date),
       max(max(cumulativeb)) over (order by t1.date)
from table1 t1 left join
     table2 t2
     on t1.date = t2.date
group by t1.date;
This gives the maximum of the two columns up to each date, which I think is what you are describing.