A Dozen Policy Questions You Can Answer with Your Agency’s Administrative Data


>>JULIA ISAACS: Well welcome everyone. My
name is Julia Isaacs, and I am so pleased to welcome you to this webinar on a dozen
policy questions you can answer with your agency’s administrative data. This is the
first in a series of webinars for CCDF Lead Agencies on building your capacity as a CCDF
Lead Agency to use data in policy decisions. Today’s webinar is brought to you by the
Center for Supporting Research on CCDBG Implementation and is cosponsored by the Child Care Administrative
Data Analysis Center, also known as CCADAC. In the next slide, I provide a brief overview
of today’s webinar. We begin with welcome and introductions. Then I will share some
sample policy questions that states have answered with administrative data. I will then turn
the webinar over to Kelly Maxwell of Child Trends who will review tips in using administrative
data. Then we get to hear from presenters from two states – Massachusetts and Georgia
– who will share their experiences in using administrative data to answer their policy
questions. Finally, we will have an open discussion based on your questions and reflections. So,
I’m going to encourage you to submit questions throughout, using the question box, as explained
by our project manager, Eleanor Lauderback, in our next slide.>>ELEANOR (ELLIE) LAUDERBACK: Thanks Julia.
Welcome everyone to the webinar, I’ll quickly go through some logistics. First of all, the
webinar is being recorded, and the recording will be posted online after the webinar. We
have a large group today, about 60 people online at this point, and so I have muted
all participants. If you would like to share questions or comments, please type them into
the Questions box. I have included a screenshot of the GoToWebinar panel in the slide, highlighting
where the questions box is located in blue. We encourage you to send in your questions
as it will help make the webinar more interactive. We also would like to know more about who
is listening in today, so I will open up a poll and ask you to tell us which of the following
groups you fall into.>>JULIA ISAACS: Thank you Ellie. As you can
see from this poll on your screen, we’re asking you to select one of five choices.
We’re limited to five groups with this technology. Are you, please select one: CCDF Lead agency
staff (from any part of the agency), State, territory, tribal or local staff from other
agencies, Technical Assistance providers or federal staff (grouping you together for this
purpose), External researchers, by that I mean those who are not in a CCDF or other
state agency, Or anyone else with a role I haven’t yet mentioned. So please indicate your response on the screen.
I’ll wait a minute until Eleanor closes the poll and pulls up the results. So, the poll results show that our audience
is. Ok. Sorry Eleanor, I did something to my screen so I can’t see it temporarily.
I’ll fix that but can you read off what we have?>>ELEANOR (ELLIE) LAUDERBACK: Yes. So it
looks like we have about 37% CCDF lead agency staff, 2% other state/territory/tribal/local
agency staff, 35% TA providers and federal staff, 18% external researchers, And 8% others.>>JULIA ISAACS: Well, it’s great to have
all of you on this webinar and thank you for joining us. Now let’s go back to the slides.
I’ll kick-off the formal presentation by sharing a dozen sample policy questions that
can be answered with administrative data. To re-introduce myself for those joining late,
I am Julia Isaacs, a senior fellow at the Urban Institute and the director of the Center
for Supporting Research on CCDBG Implementation. So as the next slide shows, the Center for
Supporting Research on CCDBG Implementation is supported through the Office of Planning,
Research and Evaluation (OPRE) in the Administration for Children and Families (ACF) and managed
through a contract with the Urban Institute. And I’d like to thank our project officers,
Meryl Barofsky and Alysia Blandon. So the goal of the Center is basically to
support CCDF Lead Agencies in building research capacity, particularly capacity to evaluate
the policy changes you are making in response to the 2014 reauthorization of CCDBG. We have a number of different activities.
There’s this webinar series, which we’re launching today. We’ve also developed a
number of written resources, which I will highlight on my final slide. And you may have
heard that we’ve been supporting grantees that have CCDBG Implementation Research and
Evaluation Planning and/or Implementation Grants. We’ve been working with about 11
CCDF lead agencies over the past two and a half years. You’ll be hearing from the project
directors from two of those grants in just a few minutes. We’ve been talking among
ourselves, the 11 agencies, on monthly web meetings and we’re delighted to open this
up to a larger webinar for all of you. I’ll take the next slide. Our topic today is administrative data. I
want to start with a definition used by Kelly Maxwell and others at CCADAC. Administrative
data is information about children, families, or service providers that is collected and
maintained as part of regular program administration. Of course, CCDF administrators and their staff
are surrounded by a lot of data on day-to-day operations. Data on licensing, QRIS, subsidy
and eligibility payments, and other aspects of program operations. The goal of this webinar
is to support you in using those existing administrative data to address policy questions
that might come at you from state legislators, agency heads, local child care providers,
and other stakeholders. And I thought it might be helpful to consider actual questions that
other CCDF lead agencies have addressed with administrative data. I’ve selected questions
that range in complexity. So some of the questions can be answered with just one data set and
some will require linkages over time or across data sets. Let’s start on the next slide with sample
questions drawing on licensing data. First question. Which areas of licensing are
most often cited as areas of noncompliance? You might also ask which providers receive
licensing citations and what does this imply? And I’ll give you a preview that this first
question about licensing will be addressed by Jocelyn Bowne, our speaker from Massachusetts. Next question on licensing, question 2. What
are the characteristics of licensed providers in the state? For example, I know South Carolina
asked the question what percentage of nonprofit licensed providers have a religious affiliation?
That was in response to a question the state had gotten from a policymaker. Third question. How do characteristics of
licensed home-based providers change after implementation of the various state policies
you are adopting in response to the 2014 Act? You might wonder how licensed home providers
overall changed and then particularly look at the changes among those serving children
with subsidies. I have put a star to indicate that looking at those serving children with subsidies would require linked administrative data, linking your licensing data and your subsidy data, so it would be a little bit more complex. But some version of this question is probably of interest to a number of you, as there are changes in the home-based providers
and it’s important to track how their numbers and their characteristics may change with
the implementation of new policies. Now I wonder if any of you have suggestions
of other questions that can be addressed with licensing data. If you have an example of
something you’re working on or done in the past, please type it in the questions box.
And then when I finish with my dozen examples, I’ll ask Ellie to help me share some that
you have typed in. But moving on in the next slide, I’m going
to share three questions related to QRIS data. Again as a preview, Randy Hudgins, who will
be our speaker from Georgia will be sharing how his agency has analyzed QRIS data to help
answer some operational questions. Okay our next sample question is, Is the quality
of programs participating in QRIS improving over time? My fifth sample question, What are the characteristics
of programs that improve their QRIS ratings over time? And number six, Do more high-quality providers
participate in the subsidy program after implementation of higher tiered reimbursement rates? And
this last question I star because it requires linking your QRIS data to your subsidy data.
Now again, I would welcome you if you would like to share a question that you have addressed
using QRIS data. Just type it into the questions box. I’ll pause a moment to let you think
and type. On the next slide, we have questions that
can be answered with subsidy data. I think I’m up to question seven. To what
extent has implementation of 12 month eligibility impacted the number and/or characteristics
of children and families enrolled in the subsidy program? I know some lead agencies have been
wondering whether some categories of children may decline if other families are staying
on for longer periods. Of course we don’t know if families are staying on for longer
periods and that’s my obvious next question: Question 8. How has the implementation of
12 month eligibility impacted subsidy spell length? And I star this one as a more complex question, because it requires developing longitudinal files, where you link your subsidy data across many months and define subsidy spells: which months families are receiving subsidies and which months there is non-receipt. So, whereas you could probably answer question 7 just by comparing your subsidy population at two different points in time and learn something interesting about whether there were changes in the number or characteristics of families, looking at subsidy spell length requires more complex analyses of linked longitudinal data.
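To make that spell calculation concrete, here is a minimal sketch in Python with pandas. The monthly extract and its column names (family_id, month) are hypothetical, invented for the example; a real analysis would start from your own subsidy payment files and your own definition of a gap.

```python
import pandas as pd

# Hypothetical monthly subsidy extract: one row per family per month of receipt.
# Column names are invented for this example.
df = pd.DataFrame({
    "family_id": ["A", "A", "A", "A", "B", "B"],
    "month":     ["2018-01", "2018-02", "2018-05", "2018-06", "2018-03", "2018-04"],
})

# Convert each month to a running month number so gaps in receipt are easy to detect.
dates = pd.to_datetime(df["month"])
df["month_num"] = dates.dt.year * 12 + dates.dt.month

# Within each family, a new spell starts whenever consecutive months of receipt
# are more than one month apart.
df = df.sort_values(["family_id", "month_num"])
new_spell = df.groupby("family_id")["month_num"].diff().fillna(1) > 1
df["spell_id"] = new_spell.groupby(df["family_id"]).cumsum()

# Spell length = number of consecutive months of receipt in each spell.
spell_lengths = (
    df.groupby(["family_id", "spell_id"])
      .agg(first_month=("month", "min"),
           last_month=("month", "max"),
           months=("month", "count"))
      .reset_index()
)
print(spell_lengths)
```

A spell-level table like this, built for periods before and after the policy change, is the kind of file that would let you compare spell lengths under 12 month eligibility.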
And question 9, the last question on subsidies: Which subgroups of families and children demonstrate the greatest levels of instability? Again, this is a more complex question because it requires longitudinal data, and sometimes these analyses are done where you link the subsidy data to additional data, like earnings records, to really learn more about family characteristics. And
of course to do this one you have to come up with a definition of instability, which
I could imagine being done a couple of different ways. So if you have other questions you are analyzing
using subsidy eligibility or subsidy payment data, please type them into the Questions
box. Okay, the next slide shows my last 3 questions
of my dozen, and I draw on 3 other types of data because of course you have lots of datasets.
For example, if you have access to a workforce registry with decent data, you could ask: Question 10. What are the credentials of the
early care and education workforce? Or, next question, if you have a partnership
with CCR&R, you could find out, What are the child care needs of families seeking help
from CCR&Rs? And, if you have data that tracks technical
assistance, you could ask, What are the characteristics of teachers and family child care providers
who receive onsite technical assistance? So I’ve included these questions to remind
you of the diversity of data sets you have available, and of course that means that there
are all kinds of different kinds of questions the data can help you answer. Okay, I’ve been talking for a while, so
with my final slide, I’d like to get to the first audience participation segment.
I’ve pulled these from briefs and things, showing real examples of what lead agencies
have done but perhaps you have additional examples. Some of you may have already shared
them earlier, but others, could you take a minute to type them into the Questions Box. And Eleanor, could you read out, do we have
any sample questions that have been submitted to date?>>ELEANOR (ELLIE) LAUDERBACK: Yes, we do,
so the first question is about licensing and QRIS data. The question is, what is the relationship
between FCC and QRIS in comparison to closures of FCC? And that’s a project they’re starting
soon it looks like in Minnesota. And then another question we got, what are
characteristics of families selecting different types of child care? And then I can do one more. Minnesota is also
currently connecting CCAP child-level data to see the percentage of Minnesota CCAP children
in QRIS?>>JULIA ISAACS: Thank you Eleanor. That first
one with the relationship of family child-care with QRIS, I bet Minnesota’s not the only
state that’s curious about that. That’s a great example of a question. Any others,
Eleanor, before I turn it over?>>ELEANOR (ELLIE) LAUDERBACK: No, nothing
else right now.>>JULIA ISAACS: Okay, well thanks everyone
for sharing those. Now from the next slide you’ll see that in a minute I’m going
to turn this over to Kelly Maxwell, who’s the Co-Director of Early Childhood Research
at Child Trends and project lead for the Child Care Administrative Data Analysis Center (CCADAC).
Many of you may know Kelly, who has worked with states on early childhood evaluations
for more than 20 years. Over to you, Kelly.>>KELLY MAXWELL: Thanks Julia. I’m happy
to be part of this webinar today and I’m going to share some tips with you about using
administrative data. Next slide please. As Julia mentioned, I lead a project called
CCADAC, which stands for the Child Care Administrative Data Analysis Center.
CCADAC is supported by the Office of Planning, Research, and Evaluation in the Administration
for Children and Families, in the US Department of Health and Human Services, with funds set
aside for research in the Child Care and Development Block Grant Act. CCADAC is part of a larger
contract with Child Trends to support child care and early education policy analyses.
Ivelisse Martinez-Beck is the OPRE project officer for the contract.
The primary purpose of CCADAC is to support the use of administrative data to address
policy-relevant early care and education questions for state child care administrators and their
research partners. I’d like to thank Kathleen Dwyer and Jenessa Malin, our OPRE team leaders
for CCADAC, for their support and guidance throughout this project. Next slide. I will begin by highlighting a few benefits
of using administrative data. First, it’s relatively low cost because
it’s data that the agency is already collecting, so you don’t need to collect new data.
Second, it’s the only source of information for some questions. If you want to know who’s
receiving a service, for instance, administrative data is likely the best source.
Third, the agency has staff who are knowledgeable about the data: program staff who use the data can help interpret analyses, and IT or data staff understand the particular data elements in the dataset. This in-house knowledge can make it easier to analyze and interpret the
data. Another benefit is that you likely have access
to data from multiple years to document changes over time. You could, for instance, examine
data about the number of licensed family child care providers over the last 5 years to determine
whether there has been a decline in these providers over time. I have listed on this slide a few examples
of possible sources for administrative data that could be used to address policy-relevant
questions. Julia already mentioned the first three, licensing, subsidy, and QRIS. Other
early care and education data sources include: pre-K, Head Start, workforce registries, Child
Care Resource & Referral, and Technical Assistance data. There are also data sources outside
of early care and education that might be helpful, and I’ve listed two here, Temporary
Assistance for Needy Families and Supplemental Nutrition Assistance Program. The CCADAC team
has a forthcoming resource that describes these and other administrative datasets that
CCDF staff could use to address policy questions. Next slide please. All data have limitations, and I have listed
4 possible limitations in using administrative data.
The first is that data quality varies. Staff may be great about consistently entering data
for some data elements, and others might be messier. When you’re considering answering
a question with administrative data, it’s important to understand the quality of the
data elements you’re considering so that you can choose the data elements you have
the most confidence in. When selecting data to analyze, you may want to consider a few
things like whether there is a document that describes each data element or whether there
are procedures for checking for possible errors in the data. A second common issue is limitations in the
data systems themselves. Many states have older data systems, or some systems that were
built at different times so they may not easily talk to one another. When analyzing administrative
data, it’s important to work within the constraints of the data system, and you may
need to think creatively to overcome some challenges. If, for example, it takes a long
time for an external vendor to run a report with some data, there might be something that
staff in the agency can run to provide at least some of the relevant information for
you. A third issue is that there may be limited
documentation about the data. If you’re analyzing the data and you are not familiar
with it, you might need to find the right person who knows the data well enough to help
you decide which data elements to use and how to interpret the findings.
Finally, administrative data are limited to program participants. If you have a question
about the low-income families who participate in the child care subsidy system, then administrative
data are perfect. If, on the other hand, you’d like to know about all of the low-income families
who are eligible for child care subsidies, then it’s important to acknowledge that
the administrative data tell you about some of those families, but not all of them. Next
slide please. This slide includes a few tips in using administrative
data. The first tip is to match the data with the
question. Start with your question and then review the administrative data to determine
which data elements might be useful in answering the question. If the data don’t quite address
the question, then you might have to tweak it to ensure that it’s something that can
be answered with the data. For example, if a legislator asked you, What’s the quality
of child care programs in our state?, you likely don’t have quality information on
ALL the programs in your state. You might have to revise the question to something like,
What percentage of licensed programs participate in the QRIS? or, Of those early care and education
programs participating in QRIS, what’s the distribution of programs at each rating level? Second, work closely with program and data
or research staff. Program staff understand the program well and know why particular data
are important. They may also be the ones entering the data, or supervising those who enter the
data, and know about the quality of data and can tell you which data they trust and don’t
trust. Data or research staff often have the skills to extract the data, combine it, run
reports, and analyze the data. So when using administrative data, both types of staff will
likely need to work together on the project. Third, include the limitations of the data
when reporting it. The data may not cover all programs in the state or may only be available
for certain years. Note those limitations in any report or presentation to help people
appropriately interpret the findings. And finally, develop a plan for linking data, if needed. As Julia mentioned, some questions will require combining information from multiple data sets, though it depends in part on how your data system is structured. If you need to link or combine data, then work closely with someone who has done this before, as well as with program staff who know the data well. Together, you can determine the feasibility of linking the data and decide how best to match programs or people across the multiple datasets.
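As a purely illustrative sketch of that matching step, the snippet below links two hypothetical extracts on a shared license number. The file names, column names, and key are all invented; the right keys and cleaning rules will depend on your own systems.

```python
import pandas as pd

# Hypothetical extracts; real file names, columns, and keys will differ by state.
licensing = pd.read_csv("licensing_extract.csv")   # e.g., license_id, provider_type, region
qris = pd.read_csv("qris_extract.csv")             # e.g., license_id, qris_level

# Standardize the shared key before merging so formatting differences
# (stray spaces, inconsistent case) do not cause false non-matches.
for frame in (licensing, qris):
    frame["license_id"] = frame["license_id"].astype(str).str.strip().str.upper()

# A left join keeps every licensed provider and flags whether a QRIS record matched.
linked = licensing.merge(qris, on="license_id", how="left", indicator=True)
match_rate = (linked["_merge"] == "both").mean()
print(f"Share of licensed providers with a QRIS record: {match_rate:.1%}")
```

Reviewing the records that fail to match with program staff, as Kelly suggests, is usually where the real linkage decisions get made.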
Next slide. I thought it would be helpful to point you
to some resources that can help you address some of the issues in using administrative
data to answer policy questions. The CCADAC team has developed several resources that
are available on the OPRE website. The slide title has a hyperlink to take you to the landing
page. And I’ll briefly mention a few resources. We have a resource to help determine whether
it’s feasible to use administrative data to answer a question of interest. Once you
determine that it’s feasible, there is another resource that describes some considerations
in getting ready to analyze administrative data, like preparing a dataset. A third resource
describes common components of a data sharing agreement, outlines steps in developing an
agreement, and includes a few examples. The final resource on this slide describes three
research partnerships between state agencies and researchers, provides examples of questions
answered by those partnerships, and describes the benefits and challenges of establishing
a strong research partnership. Next slide. Finally, I want to let you know that the CCADAC
team hosts an online discussion forum to support peer learning among researchers who are using,
or interested in using, administrative data. So if you’re interested in this topic, I
hope you will join the forum and ask questions, share resources, and offer your own tips.
Everyone is welcome! I have included the hyperlink on this slide. You can also email me if you
have questions about this. Thank you, and now I’ll turn it back to
you Julia.>>JULIA ISAACS: Thanks, Kelly. As you can
see, I’m going to move onto the next slide. Our first state speaker will be Jocelyn Bowne,
who is the Director of Research and Preschool Expansion Grant Administration in the Massachusetts
Department of Early Care and Education (EEC). Jocelyn coordinates a number of grant-funded
initiatives that support research and data use at EEC, including the CCDBG Implementation
Evaluation and Research Planning Grant. She also manages a grant initiative which funds
local efforts to increase access to preschool and build program quality and alignment. Now,
as Jocelyn is speaking, do remember, if you have questions you can type them into the
Questions box, and Eleanor will share them with us at the end during the open question
session. Now, over to you, Jocelyn.>>JOCELYN BOWNE: Thank you Julia. You can
go to the next slide. As Julia mentioned, I work at the Department of Early Education
and Care, which is responsible for a number of functions: licensing of private early education
programs across the state, of both center-based as well as family child care, developing and
managing the state’s Quality Rating and Improvement System, otherwise known as the
QRIS, administering child care subsidies and managing a number of grants that support program
quality in different ways. Next slide. The policy question I will be discussing today
is, Where do providers struggle the most to meet licensing and QRIS standards? And to
provide the policy context and motivation for the question, it’s important to understand
that EEC is currently in the process of revising our QRIS system and wants to create a new
professional support system that is aligned with the expectations of the new QRIS but
we also want to make sure we’re meeting the needs of programs in our system. And in
doing so, we want to be sure that we understand the challenges that programs face at multiple
levels of our system, including challenges with basic licensing compliance, which we
see as the foundation for quality, as well as potential challenges presented by the expectations
of the new QRIS. In this presentation I will share two different,
relatively straight-forward analyses we conducted, one using our licensing data and one our QRIS
data. And, as I will discuss, the results have informed the ongoing development of the
QRIS as well as our work with two primary partners, the StrongStart Career Pathways
grant, which is going to all the community colleges in the state in support of providing
better pathways to degrees and competencies to individual educators, and the StrongStart
Training and Technical Assistance grant, otherwise known as the SSTTA, to a local university
to provide statewide training and technical assistance, both to individuals and to programs.
Next slide. Our first analysis considered what we might
learn from our licensing data to understand the aspects of licensing compliance that have
been most challenging to programs. To do this we reviewed 3 years of licensing citations
from all visits to understand the most commonly cited regulations. EEC has 8,281 licensed
programs, or at least they did at the time that the sample was taken, and the sample
includes about 97,000 citations. So that’s an average of 4 citations per program per
year although some programs did not have any citations and some had many more than that.
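For readers curious what the tabulation behind figures like these looks like in practice, here is a minimal sketch, assuming a hypothetical citation-level extract with one row per citation and an already-assigned category column; the file and column names are invented for the example.

```python
import pandas as pd

# Hypothetical citation-level extract covering three years: one row per citation,
# with a program identifier and an already-assigned regulation category.
citations = pd.read_csv("licensing_citations_3yr.csv")  # e.g., program_id, category

# Average citations per program per year (three years of data).
per_program_per_year = len(citations) / citations["program_id"].nunique() / 3

# Share of citations falling in each category (administrative, health and safety, ...).
category_shares = citations["category"].value_counts(normalize=True).mul(100).round(1)

print(f"Average citations per program per year: {per_program_per_year:.1f}")
print(category_shares)
```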
And when we look at the citations, as you can see from the pie chart here, the largest
category of citations, 27%, were for administrative issues, 23% were for health and safety violations,
22% were for physical facility violations. Following these larger categories, 8% were
for teacher qualifications, 6% respectively for ratios and supervision as well as curriculum
and interactions, 4% for nutrition issues, and another 4% were miscellaneous violations
across remaining categories. So to make sense of the implications of these
findings for professional support needs, we wanted to check our understanding of what
the data was telling us by seeing how licensors viewed these violations, and whether the pattern
that we see here was at all surprising to them, whether they felt that there was more information we needed, or whether there were particular ways they would like help in supporting programs. We brought these results to the Regional Directors (RDs) and
asked for their reactions. And they weren’t surprised to see these results but they did
feel that licensors could provide TA effectively on some of the largest categories of citations
around concrete compliance issues, such as administration and qualification, physical
facility and some of the health and safety violations. These were violations that really
required fairly concrete guidance around things like organizing student or staff folders, the
depth of mulch in the playground or proper labeling of bleach bottles. On the other hand,
the RDs felt that issues such as problems with ratios and supervision, curriculum and
instruction, although they showed at a lower frequency, tended to reflect more serious
issues that a program might have and required more in-depth support than they could provide.
And this is also true of some health and safety violations. As a result, rather than focusing only on the highest-frequency violations, we thought about how we could meet the needs the licensors are seeing and have considered ways to provide the supports they requested. In addition
to developing courses related to health and safety for a new learning management system,
(which is designed in part for CCDBG compliance so there is overlap there), we are also ensuring
that these key topics are addressed through the SSTTA grantee offerings and that licensors
have the ability to refer programs directly to these supports. Next slide. Our next analysis looked at our QRIS data,
which, in our current system, is fairly limited. The system was designed solely to manage the
application process, is old enough that it is difficult to update to align with new requirements
and does not collect much information about programs beyond the steps taken to be granted
a new level. We have some program information about program type, enrollment and whether
the program is part of a larger agency. And we have information about application status
and activity, which includes timelines and any exemptions granted, which I will explain
in a few minutes. And finally, we have the level granted as a result of each application.
And while we would like to review a really rich profile of all the different dimensions
of program quality that programs have shown, as well as see the results of some of the
standardized classroom observations we collect, we really wanted to think about how we could
work with the data that we had. And before I get into the solution, just to
give you a sense of the scale of our QRIS, it is a 4 level system and currently 63% of
our programs participate (which is up from just over 50% a few years ago). Most of the
participating programs are at level 1 – that’s 63%, while 29% are at level 2 and only 5%
at level 3 and 4. These last two levels are the hardest levels to achieve as they have
far more stringent expectations and also include the use of external standardized observations
of quality with the ERS tools. Next slide. So, as we were puzzling over these issues,
Katie Gonzalez, a Harvard fellow who was working with us, proposed that we could use the exemptions
taken as part of this process as a window in the current barriers for programs. To understand
the exemption system, programs are allowed to request up to 4 exemptions to particular
verification criteria to move to level 2 and above. Katie reviewed the full history of
exemptions taken through May 2017 at the point at which each application was processed. For
this sample, she looked at all programs in the QRIS at the time, 5,245, and all 7,342
exemptions requested – so over half the programs moving through our QRIS system have
been granted at least one exemption. Next slide. In looking at the exemptions (to understand
current barriers to advancement as well as the potential barriers the new QRIS system we
were planning might set up), we categorized the exemptions to better understand the patterns.
Workforce-related exemptions were those taken around educator qualification requirements
at different levels. For example, 50% of educators are required to have BAs for advancement to
higher levels of the QRIS. There were also exemptions around formal PD requirements – which
include a fairly extensive set of requirements for particular trainings, all of which have
to be CEU bearing, that are expected of educators and directors. We also categorized some criteria as aligned
with what we’re calling QRIS 2.0, it’s a system that we are currently planning. We
are moving to a system with fewer rigid requirements and greater support for programs’ continuous
quality improvement efforts. The QRIS 2.0 category includes the use of observational
tools, a program’s continuous quality improvement plan, individual educator professional development
plans and the use of child screening and assessment data. Next slide. When we look at where the bulk of the exemptions
fall, we find the majority are related to workforce requirements (with 51% related to
workforce qualifications), while 25% related to formal PD requirements. Only 8% related
to the QRIS 2.0 aligned requirements and 4% were in another miscellaneous category, (which
included physical facility requirements, which are a particular challenge for family child
care programs). These results were reassuring to us as they support our belief that the
new QRIS system will be less onerous to programs, but also highlight the challenges the field
continues to face with a limited pipeline of teachers with degrees and access to PD
opportunities; and these are issues we wanted to ensure we could continue to address in
supportive ways with our new system. Next slide. So, to summarize the conclusions we’ve drawn
from these two analyses: for the QRIS, the movement towards more flexible training
and competency-based qualifications will hopefully remove some barriers in our current QRIS.
The new QRIS is also adding increased expectations and support to programs around using curriculum
effectively and ensuring educators have access to job-embedded professional learning, which
should help to address some of the more irksome challenges that licensors have noted in working
with programs. And, as I added, we are creating a direct connection between licensors and
these supports so licensors can refer programs about whom they’re particularly concerned.
The professional support system will also include our collaboration with community colleges,
through the Career Pathways grant, in providing accessible coursework and more extensive articulation
to degrees, such as through a new CDA program, which we’ll articulate. We are very aware
of the challenge and the importance of moving teachers toward CDAs, but want to address
it on the supportive end. The TA and the coaching provided by our SSTTA grantee will be required
to address topics identified as a need by EEC and by program request, and most importantly,
will provide coaching to program leadership in program management and providing job-embedded
learning opportunities for educators, which we hope will also address both our desire
to build quality in meaningful ways but also address some of the issues noted by licensors. So the review of these different sets of data
points provided information that both supported and expanded our design of our professional
support system, and our review of the QRIS system, which is currently in process. Thank
you, I’ll turn it back to you, Julia.>>JULIA ISAACS: Well, thank you, Jocelyn. That
was a lot to take in but I appreciate it and I appreciate how pulling from two different
data sets and really using the data helped to refine the development of the new QRIS
and the new professional development system. Okay, if you have questions for Jocelyn, type
them in, we’re not going to take them yet. With our NEXT SLIDE, we’re going to turn
to Randy Hudgins, who is the Director for Research and Policy Analysis with the Georgia
Department of Early Care and Learning. His team manages departmental research and performs
administrative analyses that support program leadership in implementing policy. He also
manages the CCDBG Implementation Research and Evaluation Grant that is currently funding
research focused on understanding the child care landscape in Georgia. Okay, Randy, I’ll
turn it over to you.>>RANDY HUDGINS: Great, thank you, Julia,
and good afternoon everyone. We can skip to the next slide. As Julia mentioned, I’m with the Georgia
Department of Early Care and Learning, which houses many of the state’s early childhood
services. Similar to Jocelyn in Massachusetts, this department is responsible for licensing
child care across the state, developing and managing the state’s QRIS, which we refer
to as Quality Rated, and administering Georgia’s universal Pre-K program. We also administer
CCDF subsidies through our Child Care and Parent services, what we usually refer to
as CAPS, as well as the federal nutrition programs across the state. The Department
is fortunate to also house a small Research and Policy team, of which I’m a part,
that consistently uses administrative data and research to help inform policy decisions
and strategic planning initiatives like I’m going to talk to you about today. Next slide. So today, I am going to briefly talk with
you about our 2020 goal and how the research team is working with program and departmental
leadership to dig into our administrative data to inform how we can best utilize our
resources to support providers through Quality Rated and our CAPS programs. We presented
these analyses at a recent strategic planning meeting in May where we had the opportunity
to think critically about the challenges and opportunities that are around this goal. Our
2020 strategic goal is stated, To continue receiving CCDF Subsidy, all Quality Rated
eligible providers will be star rated by December 31, 2020.
The question that we are really trying to answer with our administrative data here is
simply, How do we market Quality Rated to providers to ensure children receiving CCDF
subsidy are in quality environments by the end of 2020? So, as of May of this year, 61% of children
receiving subsidies were already in Quality Rated care and 50% of CAPS providers were
already Quality Rated. We know that we are on track to meet this goal, but we need to
know more about how to leverage our resources to best communicate and incentivize providers
receiving CAPS to become Quality Rated. Our team looked at administrative data through
five different lenses to try and help the department strategically consider the data
and how different provider settings, communities, and the overall process of becoming Quality
Rated may impact a provider’s decision to become Quality Rated. It is important to note
that for all eligible providers, becoming Quality Rated is still a voluntary process
in Georgia. Next slide. For the strategic planning meeting we discussed
at length the administrative data through these five categories shown here: (1) How
are providers going through the Quality Rated Process, (2) what do Superusers look like,
(3) How may that be different from a Family Child Care, (4) Geographically, are there
areas of the state with low Quality Rated participation but high subsidy usage,
and (5) What does this all mean for the families receiving subsidy. We want to know, Are there
different approaches we can take to incentivize or communicate to different groups about becoming
Quality Rated? Today, I am only going to share with you examples
of administrative analyses from two of these vantage points: (1) Data around the Quality
Rated Process – how providers actually go through the process, from application, receiving
technical assistance, to achieving a star rating; and (2) Data about Superusers – or
those centers that provide care to a substantial number of children receiving subsidies. Next
slide. First, I’ll share examples about how our
providers navigate the process of becoming Quality Rated. This is a visual representation
of the current process. A provider begins the process simply by signing up and submitting
an online application to participate. A provider then begins receiving technical assistance
and professional development incentives through our Resource and Referral Agencies across
the state who help prepare providers to meet the expectations of Quality Rated. A provider
must complete a structural quality portfolio demonstrating that they have gone above and
beyond the minimum licensing requirements. This part of the process is entirely dictated
by the provider. After submitting their Portfolio, DECAL schedules an Environmental Rating Scale
observation, or an ERS observation, with that provider. The structural quality and the ERS
score are then reviewed and summed to issue a provider’s star rating. This part of the
process is dictated by DECAL’s ability to process the ratings.
In Georgia, we have a 3 star system, in which 3 stars indicates the highest quality. Every
year providers are required to verify this information in their portfolio and, every
three years, providers are required to resubmit through the entire process and are issued
a new star rating at that time. So all programs must go through this process in Georgia. Next
slide. This slide shows data from our recent Validation
Study that Child Trends worked with us on. The full reports can be found on our agency
website. The study used administrative data from December 2017 and found that programs
took, on average, about a year, or 373 days, to go through the application and portfolio
process of becoming rated. These are the sections of the rating process that individual programs
have direct control over. This is important because all programs must
go through this initial process to become rated. So, how can we help providers (specifically
those with subsidy) navigate this process more efficiently to ensure that they are Quality
Rated in the next one and a half years? For example, much of this data we collect is self-reported
in a portfolio and there may be other ways to capture the information, such as collecting
classroom ratios during licensing monitoring visits instead of through a self-reported
portfolio. Next slide. This slide also shows data from our Validation
Study using administrative data from 2017. We found that it was taking DECAL almost four months to rate providers after their portfolio was submitted. It was taking about 14 days for us to approve portfolios, another 54 days to complete the ERS observations, and another 47 days to review and issue a final rating.
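The elapsed-time figures above come from simple date arithmetic on process milestones. As a hypothetical sketch (the milestone file and column names are invented for the example), the calculation might look like this:

```python
import pandas as pd

# Hypothetical milestone dates for each rated provider; column names are invented.
milestones = pd.read_csv(
    "qr_milestones_2017.csv",
    parse_dates=["portfolio_submitted", "portfolio_approved",
                 "ers_observed", "rating_issued"],
)

# Average elapsed days for each step of the rating process.
steps = {
    "portfolio submitted to approved": ("portfolio_submitted", "portfolio_approved"),
    "approved to ERS observation":     ("portfolio_approved", "ers_observed"),
    "ERS observation to rating":       ("ers_observed", "rating_issued"),
}
for label, (start, end) in steps.items():
    avg_days = (milestones[end] - milestones[start]).dt.days.mean()
    print(f"{label}: {avg_days:.0f} days on average")
```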
While our QR division has already taken steps to address and significantly reduce the time between portfolio submission and rating, we
recognize that this is a specific area of the rating process that DECAL has direct control
over and may be able to reduce further. So the question here becomes, what operational
changes can DECAL make to more efficiently process ratings to ensure that all eligible
providers with subsidy are Quality Rated in the next year and a half? Next slide. This last slide I want to share concerning
the Quality Rated Process has helped us to think more critically about how we are engaging
CAPS providers who may be struggling to complete the initial process. Using administrative
data from this past May, we found that 39% of CAPS providers who started an application
but have not yet submitted a portfolio applied within the last year. 22% of CAPS providers
started an application more than a year ago, 12% more than 2 years ago, and 27% started
an application 3 or more years ago but have not yet submitted their portfolio to become
Quality Rated. You can easily imagine how each of these four groups, particularly the
39% that recently started versus the 27% that started an application 3 or more years ago
may have different reasons for not submitting to be rated. And there are likely different
approaches the agency should be taking to incentivize or communicate the importance
of becoming Quality Rated by December 2020. Next, I will dive into the Superuser data
and how we used our administrative data to communicate to our program leadership the
uniqueness of large centers. Next slide. So, in Georgia, there are about 3,083 licensed
child care learning centers in May, of which about 71%, or 2,184, are CAPS providers and
nearly 97% of all CAPS children are served in center-based care. Therefore, in terms
of our 2020 goal of having all children receiving subsidy in a Quality Rated provider, assisting
these centers through the process is critical. Fortunately, through our administrative data
we found that, as of May, centers who are CAPS providers are more likely to have a rating.
50% of CAPS providers are rated compared to only 27% of centers not serving CAPS. Next
slide. Superusers for our purposes are defined as
centers serving more than 50 children receiving subsidy. We found, using May data, that 135 of our largest unrated centers, each serving more than 50 children receiving CAPS subsidies, together serve 45% of our targeted children, nearly 11,000 children receiving CAPS.
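As a rough sketch of how a Superuser list like this can be pulled from child-level subsidy data, assuming hypothetical file and column names (the 50-child threshold follows the definition above):

```python
import pandas as pd

# Hypothetical extracts; file and column names are invented for the example.
caps = pd.read_csv("caps_children_may.csv")        # e.g., child_id, provider_id
ratings = pd.read_csv("quality_rated_status.csv")  # e.g., provider_id, star_rating

# Count subsidized children at each center and keep "Superusers": centers
# serving more than 50 children receiving subsidy.
counts = caps.groupby("provider_id")["child_id"].nunique().rename("caps_children")
superusers = counts[counts > 50].reset_index()

# Attach rating status to see how many Superusers are not yet star rated.
superusers = superusers.merge(ratings, on="provider_id", how="left")
unrated = superusers[superusers["star_rating"].isna()]
print(len(unrated), "unrated Superusers serving",
      int(unrated["caps_children"].sum()), "children receiving subsidy")
```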
122 of these Superusers have applied and are somewhere in the process of becoming Quality
Rated but have not yet received a rating. 13 Superusers have not chosen to apply for
Quality Rated yet. This is a particularly interesting group, when you think about how
DECAL can communicate and incentivize providers to becoming Quality Rated. These providers
are choosing to miss out on large tiered reimbursement rates for subsidy payments by not being Quality Rated, and with a large portion of their licensed capacity serving children with CAPS, we thought higher reimbursement rates would strongly incentivize these providers; however, we see that is not the case. So, what are the barriers and what would make
it worth it for these providers to become rated? Also on this slide, we see that 57 Superusers
have Georgia’s Pre-K, so how can we leverage the provider’s participation in other DECAL
grant opportunities to encourage and communicate the importance of Quality Rated? And finally, 65% of Superusers are in the
metro Atlanta counties. So, it may be important to dig further into what are geographical
barriers and how can we communicate to our metro providers differently than some of our
more rural providers. Also, are there partnerships within the metro area we need to be leveraging
to better communicate and incentivize these particular providers? So, here I have briefly touched on some of
the administrative data we are currently using to help our programs align policies and promote
quality to meet our state’s 2020 strategic goal. We only looked at two categories of
data, Quality Rated Process and Superusers, but you can imagine how we can use similar
techniques to look at administrative data in terms of Family Child Care, geographically
across the state, and especially in terms of the individual families receiving and relying
on child care subsidies. Thanks for listening and I look forward to
your thoughts. Take it away, Julia.>>JULIA ISAACS: Thanks, Randy. So, on our
next slide, I think we’re going to see that we’re opening it up to your questions and
comments. And so, if you have a question, just type it into the Question Box. So, Eleanor,
are there any questions that came in already while Jocelyn, and Randy, and Kelly were speaking?>>ELEANOR (ELLIE) LAUDERBACK: Sure, I can
read this one. How long does it usually take to develop a data sharing agreement?>>KELLY MAXWELL: This is Kelly, I’ll start.
I can start answering that and then if Jocelyn or Randy has other experiences they can jump
in too. Of course, I don’t have a perfect answer for that, because I think it depends
on various things, like whether your agency has a template for creating a data sharing
agreement, and the number of people who have to review and approve a data sharing agreement.
In general, though, I think it could take anywhere from a few months to over a year
or more to finalize one. So, I would encourage anyone who’s thinking about doing research
that would require a data sharing agreement to begin developing it as early as possible,
knowing that it might take you several months.>>JULIA ISAACS: Jocelyn or Randy do you want
to add anything to that?>>JOCELYN BOWNE: No, I think that sums it
up. You know, from my perspective, it’s just very dependent on staff capacity and
what else is happening within the agency at the moment. And we have a small legal team
so I can see how time-consuming it can be to get something like this done on occasion.>>RANDY HUDGINS: Yeah, and I would completely
agree with Jocelyn and Kelly, it just kind of depends on your relationship with who you’re
partnering with and your relationship with your legal department, whether they’re big
or small.>>JULIA ISAACS: Great, well, while we’re
waiting for more questions, I think I’m going to ask a question for, I guess both
Jocelyn and Randy: I’m curious whether those administrative data analyses that you did,
did they confirm what you already knew, or did they have any sort of surprises or new
insights. So, my question is, were there any surprises from your analysis of administrative
data?>>JOCELYN BOWNE: So, I can start. I was personally
surprised by the number of exemptions that programs are taking to move through the QRIS.
I knew that that was an option, I did not realize how frequently it was an option that
programs took up. And I very much appreciate Katie’s work in thinking about how becoming
aware of that is a very real and important data source to look at and also thinking about
the ways in which it showed us a window into places where programs are really getting stuck
and would not be able to move through the QRIS if they didn’t have that exemption
option. I’d also add, in thinking about the licensing
analysis that we did, I was hoping for more information about some of the more instructionally-focused
aspects of licensing, so around thinking about curriculums and interactions in the classroom.
And what was interesting to me was to learn more through the conversations with the regional
directors and others. I had surprising conversations with licensing staff as well, just around
how those regs were cited. Curriculum tended to be cited as an overall broad category.
Licensors didn’t get into any of the sort of subcategories of the regulation that they
could’ve cited and they tended to cite that just when they saw complete chaos, whereas
I think when you look at some of the administrative data there are far more nuances in the way
things were cited. So, it was just really interesting to learn more about that data,
how it’s used and think about what we might learn from it.>>JULIA ISAACS: Well, let me jump in before
turning back to Randy, that Jocelyn, one thing that your analysis showed is you had some
data and then you shared those data with the licensors. So you shared it with people…we’ve
been talking about that, those of us that meet monthly, how useful it can be to find
not a 50 page report of data, but to find some key data findings and then share them
with a group of stakeholders and you learn so much by hearing their reflections on the
data that you have analyzed and shown them. Randy, were there any surprises in your administrative
analyses you showed us or did it confirm what you kind of already knew?>>RANDY HUDGINS: I’d say a little bit of
both. It was certainly surprising the reactions we got from the different program staff who
saw this data. I think for a lot of people, it confirmed a lot of assumptions. But as
we’ve been able to share these with other program staff who are actually doing the work,
when we talk about the quality rated process data, when we showed these to the different
program staff involved in the different stages of the quality rated process, it was interesting
to see how certain staff thought that their process was going quickly but didn’t have
the full picture of this entire year-long process for a provider. So it really helps connect
the dots for our program staff who may be implementing the program, but didn’t have
a good perspective of the overall program, or how actual providers were experiencing
the system.>>JULIA ISAACS: Yeah, that’s another great
example of…it isn’t that you’re doing a data analysis and it’s sitting on a shelf,
it’s that you’re doing a data analysis and then sharing it with people and they’re
learning from that and then you’re also learning. Eleanor, do we have any questions
or reflections from the webinar participants?>>ELEANOR (ELLIE) LAUDERBACK: We do not right
now.>>JULIA ISAACS: Well I guess I get to ask
another question then. Although please, we will welcome your questions too. I was wondering,
again for Jocelyn and Randy, although Kelly you can feel free to jump in on this one from
all your experiences, but my question is, I imagine, (I’m not going to go out on a
limb here), there are some challenges getting the administrative data ready for analysis.
Could you describe a challenge you encountered, and what you did to overcome it?>>JOCELYN BOWNE: Sure. So, I did not find
the licensing regulation data to be the easiest to work with. I suspect Massachusetts is not
unique, that the regulations are not necessarily completely coherently organized, and licensors
were citing by regulation. So even though you can think about organizing them like you would a table – where there’s a category of regulations, then a set of regulations, and then kind of sub-regulations under each of those – it was very tricky to think about what level we would report at and how we might create coherent categories, which ultimately
led to the decision to stay at the very high level of organizing categories and not try
to dig more deeply into some of the nuances of the data we collected, which is also a
piece of the motivator for talking to licensors about it because we didn’t want to lose
the richness of the experience and some of the more detailed understanding of the issues
that were leading to these problems. But it was very challenging to figure out how to
get that out of the data that we had. And the other piece I would add, just as an
aside, that analysis was done a couple of years ago. We’ve moved to a new licensing
system and I actually have not yet been able to figure out how to get that data out of
the new system in this format, by regulation. So, it’s something we’re still working
on but the change in the system somehow changed how the data was stored and how we could access
it in ways that make it harder to find this information in this format.>>JULIA ISAACS: Well, I’m imagining heads
nodding around the country as people…you’re not the only one, I’m sure, who has a new
data system and then finds it hard to replicate an analysis they did with the old data system.
I hear you say you’re still working on it in getting the data out of the data system.
Randy, do you have any challenges you’d want to share getting the administrative data
ready for analysis?>>RANDY HUDGINS: Yeah, and I would really
echo Jocelyn on two of those, the deciding at what level we’d want to report the data
at is always a big consideration and you can easily get into the weeds too far if you’re
not careful. And then making sure that the data that we do decide to represent, we put
it into coherent categories of a data analysis that will make sense to the program staff
we’re sharing it with. Something specific to this data is that we
had to link three data sources; our QRIS data, our licensing data and our subsidy data. Fortunately,
a lot of that is kept in house so we’ve been able to do that for a while now, so we
have some expertise in that, but that’s definitely a barrier for a lot of people.
But then, just on a more general scope, figuring out the best way to represent this data so
that we could present it during a strategic planning meeting in a way that leadership
across the department who have varying levels of understanding of the different programs
can understand and can use was definitely a challenge that we had to deal with, with
this data. Being able to draw out of a large dataset, and being able to condense it into
a few slides that you can share with leadership can be difficult at times.>>JULIA ISAACS: Well I really appreciate
that last response, because I think some of us get all into data analysis and all the fancy things you can do with data. You know, we think our job’s done when we’ve analyzed the data so that we know what it says, which is a very hard job, pulling the data. But then you have to figure out how to present it to leadership. It’s important to build in time for that and to think of it as a skill, to facilitate the data-driven conversations you want to have.
Kelly, you didn’t speak about a specific analysis today, but did you want to mention
any challenge, either a common challenge or a specific challenge you’ve encountered
using administrative data?>>KELLY MAXWELL: Well I think I’ll just
make one point to make a plug for having some kind of documentation for your data. I think
you’ve heard both Randy and Jocelyn talk about needing to figure out which data to
include in the analysis. Part of that is understanding what the various data elements are on a topic
and which ones likely have the best quality data. And many times that lives in someone’s
head in the agency. And it would be ideal if that information living in someone’s head could be included in a document, so that anybody in your agency, or a research partner if you choose to use one, could refer to it to understand the data elements and be sure to match the question of interest with the available data. And I know that staff have a lot of things on their list so this may be a lower
priority, but I’ll just put a plug for that and also say that if you do have any students
or researchers who are interested in working with CCDF agency staff, this could be a task
you could ask for their help on.>>JULIA ISAACS: I appreciate that suggestion.
So, I didn’t say at the beginning, but we are scheduled to go another fifteen minutes,
but we will end early depending how many questions we have. Eleanor, do we have any questions
that people have typed into the questions box?>>ELEANOR (ELLIE) LAUDERBACK: No, we don’t
have anymore.>>JULIA ISAACS: Okay. I think I will ask,
so, Kelly you shared some resources that are on CCADAC. Do you want to say a minute more
about other resources using administrative data so that people, in addition to this webinar,
have places to go to after the webinar?>>KELLY MAXWELL: Sure, thanks Julia. The
use of administrative data is increasingly popular and there are increasingly more and
more resources available. The challenge sometimes is figuring out where to find those. So, a
few years ago, CCADAC started working with the team that supports the child care and
early education Research Connections website to organize the resources and put them in
one place. So, if you go to their website, which is the Child Care and Early Education
Research Connections website, and click on the research tools tab, you’ll see a drop
down box and one of the options in that drop down box says working with administrative
data. When you click on that, it will take you to a page of resources on using administrative
data and the resources are organized into various topics like linking administrative
data or data confidentiality and security. So, after this webinar if you’re working
on research using administrative data and you have a question, I would really encourage
you to google Child Care and Early Education Research Connections, go to their website,
and find the page on working with administrative data and you’ll likely be able to find a
resource to help you. And I will also remind you that you can join the online discussion
forum, because we’re hoping that that is also a place where, if you have a question, you can ask other researchers and other folks who are analyzing administrative data and get a response based on their experiences.>>JULIA ISAACS: Great, thank you Kelly. And
I will, when I get to my final slide, include another place of resources. But, before I
turn to my last two slides and final announcements, I’d like to ask if anyone in the panel wants
to make a closing comment? And I guess we’d go in order, I don’t know if, Kelly, that
may have been your closing comment and you want to make another one, and then Jocelyn
and then Randy.>>KELLY MAXWELL: I’m fine, I don’t need
to say anything else. I appreciate the opportunity to be on this webinar.>>JULIA ISAACS: Oh, we are so happy that
you were able to join us. Jocelyn, did you want to add anything for folks?>>JOCELYN BOWNE: Yeah, I think that we’ve
talked a lot about understanding the data and the data quality and the one piece of
that I think I would like to add is that it’s important to understand who’s entering the
data and for what purpose, because that has a lot of ramifications for data quality as
well. One example that I can share is when you look at the licensing data, there are
all sorts of fields that the licensors can enter, some of which are required for them
to do their job as a licensor, and others which are extra information that somebody
somewhere along the way thought it would be really helpful if licensors collected. So,
for example, you can get information about whether a program is a Head Start program
or not from our licensing data, but there’s no reason that licensors care particularly,
in terms of the way they do their job. So, there is an awful lot of missing data in that
field and I think that people could probably come up with a number of other, similar examples.
But I just want to put in a plug for having a sense of where the data is coming from to
inform thinking about the quality of the data that you have.>>JULIA ISAACS: Thank you. Randy?>>RANDY HUDGINS: No, I just appreciate the
opportunity to be on the call today and look forward to working with other states as we
move forward. Thanks again.>>JULIA ISAACS: Well I thank you, Randy and
Jocelyn and Kelly, for great presentations and for this informal conversation afterwards,
which I find almost as useful as the formal presentations.
So, I’d also like to thank those of you who are listening in, and in the next slide,
we have two slides left, so next slide, I want to make sure you know about the upcoming
webinars in this series, Building Your Capacity as a CCDF Lead Agency to Use Data in Policy
Decisions. Each webinar will cover a different data and research skill. This first webinar
has started, we thought it was the starting point, with your own administrative data.
The second webinar, we titled it, Mapping Answers to Child Care Questions: Comparing
Your Administrative Data with Other Data, including Census Bureau data. So you can see
where you’re meeting the need and where there’s unmet need. We plan to schedule
that for the fall. I’ve gotten two of the speakers confirmed today, I’m happy to say,
but I’ll wait until we have them all and we find a date and we’ll send out that information
to those of you who registered for this and try to make it widely available. And then
we will hold at least one more webinar in the winter or spring of 2020, on a topic to
be determined. Now in our final slide, I want to highlight
some resources that we developed at the Center to support CCDF Lead Agencies in building your research and evaluation capacity. So I’m most excited that just this week,
we have an updated version of our annotated bibliography, it’s called: Research and
Evaluation Capacity Building: A Resource Guide for Child Care and Development Fund Lead Agencies
(Revised 2019). And we issued one a year ago but we just issued
a revised one this week. There are so many resources out there in the world, so to get
you started, we provide a very concise, annotated list of written and online resources that
might be relevant to child care agencies. For example, we share five resources to help
you work with evaluators and more than two dozen resources related to working with administrative
data, including that webpage that Kelly just mentioned. So, you can find it on the OPRE
website or on the website for the Center. And we’ll also plan to send it out to those
of you who have directly registered for the webinar. I think you’ll get an email tomorrow. There are other resources, which also can be found on the OPRE web page and our Center web page. We have a self-assessment tool and discussion guide. It’s called, Research and Evaluation Capacity: Self-Assessment Tool and Discussion
Guide for CCDF Lead Agencies. A really great way to do self-assessment. How are you in
your research evaluation capacity? Where do you want to build more? Where are the gaps?
Where are your strengths? We also have done a brief on Evaluating Training and Professional
Development for Home-Based Providers. And we’re working on a brief that’s not out
yet, but which will be about procuring research and evaluation services. So that gives you
a sense of the types of resources on our webpage. You can google Center for Supporting Research
on CCDBG Implementation and that will pull up the Urban webpage. So finally, if you have questions or comments
on this webinar series, feel free to email me, Julia Isaacs, and my email address is
[email protected] And I guess I’ll close. Another thank you to Eleanor Lauderback and
behind the scenes, Teresa Derrick-Mills, who’s the associate director of the Center and to
all of you for helping us kick off our first webinar in this series. Thank you.
