DAHLGREN, Va. –
You sunk my battleship! That may be the cry of the teams that will compete in the Naval Surface Warfare Center Dahlgren Division (NSWCDD) Artificial Intelligence and Machine Learning Innovation Challenge at Dahlgren March 2-4, 2023. For now, college and university students nationwide are polishing their white papers due Nov. 15 for Phase 1 of the challenge.
“This is our very first prize challenge for Naval Surface Warfare Center Dahlgren Division and we could not be more excited,” NSWCDD Chief Technology Officer Jennifer Clift said during the Oct. 28 Challenge Talk, held via Zoom from the University of Mary Washington (UMW) Dahlgren Campus.
Carnegie Mellon University, Christopher Newport University, Cornell University, UMW and Virginia State University were among the participants that posed questions to an NSWCDD panel that included Clift; Dr. George Foster, distinguished engineer for combat control; Dr. Jeffrey Solka, distinguished scientist for naval data systems; Tamara Stuart, Innovation Lab (iLab) director; and Reese VanWyen, contracting officer.
The challenge is to develop AI and ML algorithms that solve a problem in a way NSWCDD has not yet explored. The prize purse totals $100,000 – $50,000 for first place, $30,000 for second and $20,000 for third – paid to the winning teams’ respective schools.
NSWCDD is looking for teams of five or fewer members, and each school may field multiple teams. Up to 25 teams will be selected for Phase 2 at the UMW Dahlgren Campus.
In Phase 1, the objective of the white paper is to demonstrate the team’s knowledge of advanced AI/ML skills, its ability to develop and apply advanced AI/ML algorithms, and its strategy and plan for approaching Phase 2. White papers are due via the Challenge.gov website at https://www.challenge.gov/?challenge=artificial-intelligence-(ai)-and-machine-learning-(ml)-algorithm-development-challenge and should not exceed five pages, single-sided, in Times New Roman, 12-point or larger font. Questions and answers from the Challenge Talk will be posted to the website.
Clift highlighted the white paper criteria and provided examples for consideration.
1. Describe specific AI/ML experience of the team in algorithm design, development and application.
“You could talk about relevant courses that you’re taking in AI and ML – the percentage of the team who has a course or total number of courses in AI and ML. That could be machine learning, data sciences or deep learning, for instance. Have the students completed course work in Python or Java?” Clift inquired.
2. Describe the benefits of the team’s participation in the challenge and how each member plans to use this experience in their future academic and/or career endeavors.
“Are there opportunities for future capstones or experiential learning? That’s not required but just an opportunity, just a consideration. Do you plan to pursue a career or graduate education in these particular disciplines? Are any of the team members already planning to go forward in AI/ML or data sciences? Are there benefits to your existing course work or research on this particular challenge?” she said.
3. Describe the team’s approach toward designing, developing and training AI/ML algorithms for the purpose of enabling automated engagements or actions when encountering various obstacles. This approach should demonstrate the team’s comprehension of the problems to be solved.
“It would help us if you could talk about the method you are planning to use. For instance, are you going to use reinforcement learning, deep learning, genetic algorithms, classical optimization methodologies or strategies? Do you have any previous experience using or employing any optimization methods?” Clift said.
4. The entry must articulate why the team wants to participate in the challenge and why it believes it will develop an AI/ML algorithmic solution that best meets the objectives of this prize challenge. Examples of previous development projects are strongly encouraged.
“[Describe] any prior resource optimization. Is AI/ML a focus of your university? Diversity of thought that will foster out-of-the-box approaches or thinking on this particular topic, as well as diversity of academic disciplines and backgrounds, would demonstrate benefits of this team,” Clift said.
NSWCDD is tentatively scheduled to evaluate the white papers Nov. 16-Dec. 15 and anticipates notifying the selected Phase 2 participant teams no later than Jan. 6, 2023. In Phase 2, the teams will develop an algorithm using AI/ML to defend their friendly ships against enemy missiles of varying capabilities.
“We need you. We need your problem-solving skills and we are super excited to be working with you. I am extremely interested in the opportunity for some of the nation’s finest students to tackle a problem and come up with a very unique solution that we haven’t thought of to address a really tough Navy problem,” Clift said.