Add caseload size calculation #44
PS: I banged out the synthetic data; if anybody has improvements or comments, I'd be glad to hear them.
This is my attempt using SQL:
This works for me:

```sql
-- Caseload
declare @StartDate date = '2022-08-31'
;with cteCaseload as (
    select c.IDPatient
```
This one is for a dplyr version… It uses
You could treat this as a continuous function by pivoting the dates into a single column, then using a cumulative sum. There is one big caveat if you were using this: you would need to write a query that extracted dates before your period of interest, e.g.

```r
library(tidyverse)

# Append a final row per group so the step plot extends to end_date
add_end_row <- function(.data, end_date = NULL) {
  if (is.null(end_date)) {
    end_date <- max(.data$date)
  }
  bind_rows(.data, summarise(.data, date = end_date, across(count, last)))
}

test_frame |>
  # pivot referral_date and discharge_date into a single date column
  pivot_longer(ends_with("date"), names_to = "date_type", values_to = "date") |>
  drop_na(date) |>
  # a referral adds one to the caseload, a discharge removes one
  mutate(count = ifelse(date_type == "referral_date", 1, -1)) |>
  arrange(date) |>
  select(team_desc, date, count) |>
  group_by(team_desc) |>
  # running total = caseload over time
  mutate(across(count, cumsum)) |>
  add_end_row(Sys.Date()) |>
  ggplot(aes(date, count, colour = team_desc)) +
  geom_step()
```
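For comparison, here is a rough pandas sketch of the same event-based approach: each referral is a +1 event, each discharge a −1 event, and the per-team running sum is the caseload as a step function over time. The column names (`team_desc`, `referral_date`, `discharge_date`) and the tiny example frame are assumptions for illustration, not the repo's real data:

```python
import pandas as pd

# Hypothetical example data: one row per referral, discharge_date is NaT
# while the referral is still open.
test_frame = pd.DataFrame({
    "team_desc": ["A", "A", "B"],
    "referral_date": pd.to_datetime(["2022-01-01", "2022-01-10", "2022-01-05"]),
    "discharge_date": pd.to_datetime(["2022-01-20", pd.NaT, pd.NaT]),
})

# Pivot the two date columns into a single "event" column: a referral
# contributes +1 to the caseload, a discharge contributes -1.
events = (
    test_frame
    .melt(id_vars="team_desc", value_vars=["referral_date", "discharge_date"],
          var_name="date_type", value_name="date")
    .dropna(subset=["date"])
    .assign(count=lambda d: d["date_type"].map(
        {"referral_date": 1, "discharge_date": -1}))
    .sort_values("date")
)

# Per-team running total of events = caseload over time.
events["caseload"] = events.groupby("team_desc")["count"].cumsum()
print(events[["team_desc", "date", "caseload"]])
```

Like the dplyr version, this gives one row per caseload change, which can then be plotted as a step chart or joined to a calendar of days.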
The new package {ivs} (https://github.com/DavisVaughan/ivs) might also be useful for this; see the YouTube talk at 50:52.
Using this script to produce example data (or your own example data), please add a function that calculates the caseload size per team (or teams) by day (or range of days).
See the README for an explanation of the data.
We would love multiple solutions to this, please: tidyverse, base, data.table, SQL, Python, you name it 🙂
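Since Python is one of the invited languages, here is a minimal day-by-day sketch of the requested calculation: the caseload on a given day is the number of referrals opened on or before that day and not yet discharged. The column names and the toy data are illustrative assumptions, and whether the discharge day itself still counts is an assumed convention:

```python
import pandas as pd

# Illustrative data; column names are assumptions, not the repo's schema.
referrals = pd.DataFrame({
    "team_desc": ["A", "A", "B"],
    "referral_date": pd.to_datetime(["2022-01-01", "2022-01-10", "2022-01-05"]),
    "discharge_date": pd.to_datetime(["2022-01-20", pd.NaT, pd.NaT]),
})

def caseload_by_day(df, dates):
    """Count open referrals per team on each requested date.

    A referral is "open" on a day if it was referred on or before that
    day and not yet discharged (here the discharge day itself still
    counts as open -- adjust the >= as needed).
    """
    out = []
    for day in dates:
        open_now = df[
            (df["referral_date"] <= day)
            & (df["discharge_date"].isna() | (df["discharge_date"] >= day))
        ]
        counts = open_now.groupby("team_desc").size().rename("caseload")
        out.append(counts.reset_index().assign(date=day))
    return pd.concat(out, ignore_index=True)

print(caseload_by_day(referrals, [pd.Timestamp("2022-01-15")]))
```

Passing a full `pd.date_range` as `dates` gives the caseload for a range of days; for large data the event-plus-cumulative-sum approach above scales better than this per-day filter.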