A case on using AI systems as personal and family virtual assistants
From The Embassy of Good Science
What is this about?
This case study examines AI systems used as personal and family virtual assistants. These assistants, powered by generative language models, help users manage daily activities, preferences, and communication. One scenario involves a personal AI assistant designed for private interaction and role-play conversations. The second involves a family assistant, Aurora, which supports multiple users in a shared environment by organizing family schedules, planning activities, and managing shopping lists while handling sensitive personal and family data.
Why is this important?
This case is important because AI assistants increasingly influence how individuals and families organize daily life, communicate, and make decisions. While these systems offer convenience in planning activities and managing information, they also process sensitive personal and family data. The shared family context raises additional concerns about privacy, boundaries, and data protection. Responsible design and use of such assistants are essential to maintaining trust, personal autonomy, and healthy family interactions.
For whom is this important?
IT / Technical Professional
Technology Developer / Engineer
Researchers
Research Ethics Committee Member
What are the best practices?
Best practices include ensuring strong data protection and clear privacy controls for personal and family information. Users should be able to review, edit, or delete stored data and understand how the assistant uses their information. Transparent communication about system functions and limitations is essential. Families should also establish shared rules on how the assistant handles personal preferences and information. Human awareness and responsible use help maintain healthy boundaries and prevent overdependence on AI assistants.
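The review, edit, and delete controls described above can be illustrated with a minimal sketch. This is not part of the case itself: the class and method names below are hypothetical, chosen only to show how a per-user data store in an assistant like Aurora might expose these controls.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class AssistantDataStore:
    """Hypothetical per-user data store with review, edit, and delete controls."""
    _records: Dict[str, Dict[str, str]] = field(default_factory=dict)

    def store(self, user: str, key: str, value: str) -> None:
        # Record a piece of information the assistant holds about a user.
        self._records.setdefault(user, {})[key] = value

    def review(self, user: str) -> Dict[str, str]:
        # Users can see every item stored about them (a copy, not the internals).
        return dict(self._records.get(user, {}))

    def edit(self, user: str, key: str, value: str) -> None:
        # Users can correct an existing item.
        if user in self._records and key in self._records[user]:
            self._records[user][key] = value

    def delete(self, user: str, key: Optional[str] = None) -> None:
        # Users can delete a single item, or all of their data at once.
        if key is None:
            self._records.pop(user, None)
        elif user in self._records:
            self._records[user].pop(key, None)


# Example: a family member reviews, corrects, and then erases their data.
store = AssistantDataStore()
store.store("alice", "dietary_preference", "vegetarian")
store.edit("alice", "dietary_preference", "vegan")
store.delete("alice")
```

In a real deployment these operations would sit behind authentication and apply to persistent storage, but the principle is the same: each family member can inspect, correct, and remove what the assistant knows about them.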
