To whom it may concern: I dragged a `Text Import` node from the `Text Mining` tab into a `Diagram`. When I execute it, the error "Proc tgfilter reports Error: Cannot load the tktg extension. Check the tktg.dll." appears. A screenshot is provided in the attachment. Some details: server: odaws01-apsel.oda.sas.com; last run time: 5/18/24 5:38 am (UTC).
I have very little experience working with weights, so please correct me if my understanding is wrong.
I'm trying to create a summary table of unadjusted rates of quality of care between the TM and MA groups. I was able to produce a table with Proc Ttest and ODS. However, the survey uses a complex design. I need to add a weight variable and, it appears, replicate weight variables. Unfortunately, Proc Ttest can accommodate a weight variable but not replicate weights.
Just to experiment, I tried running Proc Ttest with the weight variable, and the significance of the variables improved. That confuses me, because the study documentation says: "To permit the calculation of random errors due to sampling, a series of replicate weights were computed. Unless the complex nature is taken into account, estimates of the variance of a survey statistic may be biased downward." In other words, ignoring the design means underestimating the variance. And if the true variance is actually higher, shouldn't that make results less significant? One variable I looked at has Pr > |t| = 0.0158 unweighted and 0.0025 weighted.
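For reference (my addition, not from the study documentation): with Fay's method of balanced repeated replication (BRR) and R replicate weights, the variance of an estimate is computed from the spread of the replicate estimates around the full-sample estimate:

```latex
\widehat{\mathrm{Var}}\left(\hat{\theta}\right)
  \;=\; \frac{1}{R\,(1-k)^{2}} \sum_{r=1}^{R} \left(\hat{\theta}_{(r)} - \hat{\theta}\right)^{2}
```

where \(\hat{\theta}_{(r)}\) is the estimate recomputed with the r-th replicate weight, k is the Fay coefficient (0.30 here, matching `fay=.30`), and R = 100. Since this formula needs all the replicate estimates, a procedure without REPWEIGHT support (like PROC TTEST) cannot produce design-correct standard errors, which is consistent with the documentation's warning.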
Based on what I found in the study documentation, I'm trying to use Proc Surveyfreq instead. However, this is confusing me as well. The Pr > ChiSq score is now <.0001 for every variable, even those that were not significant when I used Proc Ttest.
Here is the code, with sample data and Proc TTest commented out. I'm only including 1 of the replicate weights here, but there are actually 100 of them:
data have;
infile datalines dsd dlm=',' truncover;
input ACC_HCTROUBL_r ACC_HCDELAY_r ADRD_group TM_group
PUFFWGT PUFF001;
datalines;
3,1,1,0,1310.792231,1957.576268
2,1,1,1,10621.60998,18588.46812
3,2,1,1,3042.093381,5484.728615
3,2,1,1,3166.358963,5497.289892
3,2,1,0,1481.272986,432.6313548
2,2,1,1,6147.605583,9371.965632
2,1,1,1,14001.79093,16689.25322
3,1,1,1,2035.685768,530.211881
2,1,1,1,6356.258972,1899.874476
3,2,1,0,1487.104781,2018.636444
2,1,1,0,5002.553584,1364.125425
2,2,1,1,2493.79145,4039.542597
3,2,1,0,2260.257377,3495.675613
2,2,0,1,9358.048737,2835.543292
3,2,1,1,2978.506348,4932.378916
3,2,1,1,2794.906054,5118.430973
3,1,1,0,1663.418821,519.7549258
3,2,1,0,2083.459361,3067.105973
2,1,1,0,5106.785048,8672.202644
3,1,1,1,3447.574748,854.6276748
3,2,1,1,2819.233426,899.849234
3,2,1,0,4067.38684,6463.15598
3,2,1,1,1249.96647,2053.666234
3,2,1,1,1730.237908,3058.307502
3,2,1,1,4932.936202,1479.55826
; RUN;
/*
PROC TTEST plots=none data=have;
  CLASS TM_group;
  VAR ACC_HCTROUBL_r ACC_HCDELAY_r;
  WEIGHT PUFFWGT;
  REPWEIGHT PUFF001;  * full list: REPWEIGHT PUFF001-PUFF100;
RUN;
*/
PROC SURVEYFREQ data=have VARMETHOD=brr (fay=.30);
  TABLE (ACC_HCTROUBL_r ACC_HCDELAY_r) * TM_group / row chisq lrchisq;
  WEIGHT PUFFWGT;
  REPWEIGHT PUFF001; /* full list: REPWEIGHT PUFF001-PUFF100 */
  WHERE ADRD_group ^= 1;
RUN;
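Not part of the original post, but as a sketch of one common way to get t-test-style comparisons of means that respect the replicate weights: PROC SURVEYMEANS (for per-group estimates) and PROC SURVEYREG (for a formal test of the group difference) both accept WEIGHT and REPWEIGHT. Variable names reuse the ones above; with a single replicate weight the variance estimate is not meaningful, so this only illustrates the syntax.

```sas
/* Sketch: domain means with BRR (Fay) replicate variance.
   With the real data the list would be REPWEIGHT PUFF001-PUFF100. */
proc surveymeans data=have varmethod=brr (fay=.30) mean stderr;
  domain TM_group;                       /* per-group (TM vs MA) estimates */
  var ACC_HCTROUBL_r ACC_HCDELAY_r;
  weight PUFFWGT;
  repweight PUFF001;
run;

/* Sketch: a formal design-based test of the TM_group mean difference. */
proc surveyreg data=have varmethod=brr (fay=.30);
  class TM_group;
  model ACC_HCTROUBL_r = TM_group;
  weight PUFFWGT;
  repweight PUFF001;
run;
```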
Hi folks,
I am attempting to read an xlsx file into SAS without SAS modifying the variable types or the characters as they appear when viewing the xlsx file in Excel. In other words, if something looks like a date (e.g., "11/25/2023"), regardless of how it's stored in the xlsx file, it should show up in SAS as a character variable with that same value ("11/25/2023"). That is, I'd like to replicate what the Import Data wizard in SAS does when it defines all fields as character, as in the following attributes for one particular column, for example:
type=String
Source Informat=$CHAR29
Len.=29
Output Format=$CHAR29.
Output Informat=$CHAR29.
The code automatically generated by the Import Data wizard when I do this contains a DATALINES4 statement, which I can't use, because I'm trying to load the data without having to look at its contents first. I'd be fine, though, defining the variable types/lengths/formats/etc. myself, as I know what the variables are and what they'll be named.
How do I do this without having to fill in the DATALINES4 content?
I need to be able to specify the sheet within the Excel workbook in question as well.
Thanks!
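Not from the original post, but one possible sketch: the XLSX import paths decide column types themselves, so a common workaround is to export the target sheet to CSV and read it with a DATA step that declares every variable as character; this reproduces the wizard's all-character behavior without DATALINES4. The variable names, lengths, and path below are placeholders, not real column names.

```sas
/* Sketch: read a CSV export of the target sheet with every column
   forced to character. Names, lengths, and path are examples only. */
data want;
  infile "C:\data\workbook_sheet1.csv" dsd dlm=',' firstobs=2 truncover;
  informat col1 $char29. col2 $char10. col3 $char50.;
  format   col1 $char29. col2 $char10. col3 $char50.;
  input    col1 $ col2 $ col3 $;
run;
```

If you must stay inside the xlsx (to pick a sheet by name), the SAS/ACCESS to PC Files Excel libname engine documents a DBSASTYPE= data set option that can force a column to character, e.g. `set xl.'Sheet1$'n (dbsastype=(colA='CHAR(29)'));`, but that route depends on having that engine installed with matching bitness, so treat it as something to verify in your environment.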
I have a scenario as follows:
data have;
  input id var1 var2;
datalines;
1 . 5
2 3 4
3 4 5
5 0 6
6 9 8
;
run;
%let nvar = 2;
%let v1 = 5; /*Condition for var1*/
%let v2 = 8; /*Condition for var2*/
Here is my working array code:
data want;
  set have;
  array var{2} var1-var&nvar;
  array flag{2} flag1-flag&nvar;
  do i = 1 to 2;
    if var{i} >= 5 then flag{i} = 1;
    else flag{i} = 0;
  end;
  drop i;
run;
This code runs correctly, but instead of var{i} >= 5 I need something like var{i} >= &v{i}, so that the condition changes as i changes. However, that modification does not work; SAS says it cannot resolve &v{i}. Any suggestions? Thank you.
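A hedged sketch of one common workaround (using the macro variables v1 and v2 defined above): macro references like &v{i} cannot be indexed by a DATA step variable, but the cutoffs can be loaded into a temporary array once, or resolved at run time with SYMGETN.

```sas
data want;
  set have;
  /* Load the per-variable cutoffs (&v1, &v2) into a temporary array. */
  array v{2} _temporary_ (&v1 &v2);
  array var{2} var1-var&nvar;
  array flag{2} flag1-flag&nvar;
  do i = 1 to &nvar;
    flag{i} = (var{i} >= v{i});  /* boolean expression yields 1/0 */
  end;
  drop i;
run;
```

An equivalent run-time alternative is `flag{i} = (var{i} >= symgetn(cats('v', i)));`, which looks the macro variable up by constructed name. Note that in SAS a missing value compares low, so a missing var{i} gets flag 0, matching the original IF/ELSE logic.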
Hi guys,
suppose to have the following dataset:
data DB;
input ID :$20. Admission :date9. Discharge :date9. Diagnosis :$20.;
format Admission date9. Discharge date9.;
cards;
0001 06DEC2014 14DEC2014 VIRUS_A
0001 08NOV2020 11NOV2020 FLU
0004 14MAY2014 02JUN2014 FLU
0004 30JUN2015 15AUG2015 FLU
0004 16FEB2019 18FEB2019 VIRUS_A
0005 10AUG2019 11SEP2019 FLU
....
;
I have to fit a time-series model to estimate the weekly number of hospitalizations for VIRUS_A. The dataset shown is just an example of the real one. I don't know how to derive the weeks from the admission dates I have (I also have discharge dates). The study starts in 2014, but patients are admitted at different points during 2014 and later. Moreover, does the week numbering start from 01 January? If so, what happens if 01 January falls in the middle of a week?
Apart from the practical SAS programming, the theory behind mapping dates to weeks is also unclear to me. This is the first time I deal with this kind of data and these questions.
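Not from the original post, but a sketch of one common approach to the mechanics: snap each admission date to the first day of its week with INTNX, then count admissions per week. In SAS the 'week' interval starts on Sunday by default (use 'week.2' for Monday-based weeks), so a 01 January falling mid-week simply belongs to whichever week contains it; ISO-style numbering (the WEEKV format) is a separate convention to consider.

```sas
/* Sketch: collapse admissions to weekly counts for VIRUS_A,
   assuming the DB dataset from above. */
data weekly_prep;
  set DB;
  where Diagnosis = 'VIRUS_A';
  week_start = intnx('week', Admission, 0, 'b');  /* first day of that week */
  format week_start date9.;
run;

proc freq data=weekly_prep noprint;
  tables week_start / out=weekly_counts (rename=(count=n_admissions));
run;
```

One caveat for time-series modeling: weeks with zero admissions will not appear in the PROC FREQ output at all; PROC TIMESERIES (SAS/ETS) with `id Admission interval=week accumulate=total setmiss=0;` is one way to fill those gaps with explicit zeros, if that product is available to you.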
Thank you in advance