Where does AI actually help recreation organizations? A YMCA COO shares his perspective

What does AI actually look like inside community organizations? In this webinar recap, a YMCA COO shares practical insights on AI adoption, staff impact, and how recreation leaders are approaching new technology thoughtfully.

Alanna Crochetiere
March 12, 2026

When we shared our recent research on AI in recreation, one theme came through clearly: organizations are curious, but cautious. Familiarity with AI is high. Expectations are growing. And yet many community-led organizations are still deciding where AI truly belongs.

To explore these findings further, we hosted a webinar bringing together research insights and real-world perspectives from recreation leaders across the industry.

One of those leaders was Greg Hatzisavvas, Chief Operating Officer at the Westfield Area YMCA. Greg has spent more than 17 years with the YMCA, overseeing areas including fitness, sports, aquatics, member services, and now all YMCA programs, marketing, and operations.

Greg brought exactly the kind of perspective the conversation needed: practical, thoughtful, and grounded in the day-to-day realities of running a community organization. His comments reinforced an important truth behind the research: in recreation, the question is not whether AI matters. It is how to use it in ways that genuinely support staff, members, and mission.

If you'd like to hear the full discussion, you can watch the webinar recording here.

Here are a few of the biggest insights Greg shared.

AI adoption is not being slowed by any one thing

One of the strongest findings in our research was that AI adoption is not being held back by a single barrier. Instead, organizations are balancing a mix of factors: capacity, trust, value, budget, and competing priorities.

Greg’s response made that reality feel very tangible.

He pointed first to privacy.

“Privacy has to be at the forefront of any technology decision. The safety of our members’ data, as well as our staff’s information, is critical. That responsibility has to guide how we evaluate and adopt new tools.”

For a YMCA, that means thinking not just about members, but also staff data and the broader responsibility that comes with stewarding community trust.

At the same time, he also spoke to a reality many recreation professionals will recognize immediately: even when there is interest in innovation, the day-to-day pace of the work makes it hard to carve out time for strategy.

“In the YMCA space, every day can look different. You come in with a plan, but priorities can shift quickly. That makes it challenging to carve out the time to step back and plan a long-term strategy around new technology.”

That insight helps explain why AI can feel both important and hard to prioritize. It is not that organizations do not care. It is that they are trying to balance future planning with the urgency of serving people right now.

The most helpful AI use cases are often the simplest ones

Greg also described how some of the most useful AI ideas at the Westfield Area YMCA have come from simple conversations with staff.

In one recent discussion, his marketing team was reviewing a department whose program registrations were not trending as strongly as they had in previous months. As the team talked through possible reasons, one staff member pointed out that the program descriptions had not been updated in a few years.

They suggested using AI to refresh the descriptions and make them more engaging.

“Sometimes the best ideas come simply from talking with your team. One person might see an opportunity to use AI that others hadn’t considered. Especially with a technology that’s evolving this quickly, those conversations can surface solutions you might not have discovered otherwise.”

The suggestion was simple but practical. Rather than trying to launch a large AI initiative, the team identified a small, immediate way to improve how programs were presented to families.

That kind of collaborative discovery, Greg noted, is often how AI adoption begins inside organizations.

“What I might think is a useful application of AI could be completely different from what someone else on the team sees. Having those conversations helps surface ideas you might not otherwise think about.”

In other words, AI adoption doesn’t always start with a strategy document. Sometimes it starts with a team conversation about a real problem, and someone asking: could AI help with this?

Staff buy-in grows when people see real value

One of the most compelling moments in the webinar came when Greg described a staff member encountering AI in a practical work setting for the first time.

His team was reworking language, and someone in the room who had never really used AI before saw how quickly it helped generate and refine ideas.

“One of our team members hadn’t really used AI before, and when they saw how quickly it helped us reshape the language we were working on, their reaction was essentially, ‘This is pretty useful. How can I start using it in my own work?’”

That reaction reflects something many organizations are experiencing. AI often becomes more approachable not when people hear abstract claims about it, but when they see a specific, low-risk use case that helps them in their own work.

Greg’s framing here aligned closely with our research. Caution does not mean resistance. In many cases, it simply means people want to understand the value before they invest more deeply.

Human connection is still the standard

One of the strongest throughlines in both the research and Greg’s perspective was this: people do not want technology to replace the human value that community organizations provide.

That came through especially clearly when the conversation turned to member communication.

Greg noted that families do notice when communication feels overly automated or inauthentic.

“If communication starts to feel too artificial or overly polished, people can sense that. Authenticity still matters, especially in community organizations where relationships are so important.”

That observation echoes one of the clearest findings in our survey: even when digital tools are used, community members still want experiences to feel personal and supportive.

For organizations evaluating AI, that is an important reminder. The goal is not to make every interaction feel more automated. The goal is to use technology in ways that preserve and strengthen the human side of the experience.

AI can support staff without replacing them

This was one of the most encouraging parts of the discussion.

When we asked Greg about Cecilia, Amilia’s AI assistant for admins and staff, his answer was not about novelty. It was about usefulness, especially for teams that are not always on-site or deeply familiar with the system.

“Having a tool that staff can quickly turn to for answers—especially when they’re working limited hours or covering a shift—can be extremely valuable. It helps them find what they need without always relying on another person being available.”

He also pointed out why this matters so much in a YMCA setting. Not every team member works full time. Not everyone has the same level of system knowledge. And not every question comes up during a standard workday.

“In an environment where many staff work part-time or limited shifts, it’s not realistic for everyone to remember every detail about the system. Tools that help staff find answers quickly can make a real difference.”

That perspective lines up closely with another major finding from the research: community members are broadly supportive of AI being used for repetitive, behind-the-scenes work when it helps staff spend more time supporting people directly.

In other words, AI is most welcome when it protects the human role rather than replacing it.

Community organizations do not need to rush

If there was one broader message that came through in Greg’s comments, it was this: thoughtful progress is still progress.

He repeatedly returned to the idea that organizations need to assess where they are losing time and momentum, identify where AI might actually help, and build internal policy from there.

“The starting point is understanding where your organization is losing time or momentum. From there, you can build internal policies and determine which tools actually support those needs.”

That is a grounded, operator-led way to think about AI adoption. It does not dismiss the opportunity. But it also does not treat speed as the goal.

For community organizations, that may be the most important takeaway of all.

What leaders can take from Greg’s perspective

Greg’s comments reinforced much of what we heard in the research, but they also gave those insights more texture.

AI adoption in community organizations is not being shaped by hype. It is being shaped by real questions:

  • Where does this actually help?
  • How do we protect trust?
  • What makes sense for our staff and members?
  • What supports our mission rather than distracting from it?

That is why the conversation around AI in recreation needs voices like Greg’s. Because ultimately, this is not just about technology. It is about stewardship.

And in community-led organizations, stewardship still starts with people.

Want to hear the full conversation and Greg’s perspective firsthand?

Watch the full webinar recording to explore the research findings, hear more insights from industry leaders, and learn how recreation organizations are approaching AI today.

Watch now