08 Feb Post SC16: Top 10 Things in Retrospect
My post-SC blog is here! I wanted to share my thoughts and impressions of SC16, what I found interesting, and give you an opportunity to share your thoughts.
I wrote a Top 10 of what I was looking forward to, prior to the show. You can see that blog here:
1. The Student Cluster Competition.
The Student Cluster Competition did not disappoint! There was a power failure, which required the teams to do a checkpoint. I think the teams knew the power failure was coming – not sure if they knew WHEN it would happen, but they didn’t seem rattled by it. Or maybe they are just that good and developed good systems.
Another exciting piece from the SCC: for the first time ever, the team that won the overall competition also won the award for the highest performance on the Linpack benchmark. The winning team, “SwanGeese,” is from the University of Science and Technology of China.
Great competition for the 10th anniversary of the SCC at SC.
Radio Free HPC Reviews the SC16 Student Cluster Competition Configurations & Results
2. The Technology.
For technology, I predicted FPGAs and liquid cooling would be the technologies to watch. And I’m standing by that prediction. Why?
Did you notice there was a section of the show floor where most of the newer liquid cooling vendors had their 10×10 booths? You know – that section farthest from the heart of the show where the last booths were sold? That was my first indication that many new liquid cooling players decided they were finally ready for SC in 2016.
“Faster” is the game in HPC. You can achieve speed with GPUs, FPGAs, or faster CPUs. GPUs have been around a while – you go to NVIDIA and that’s where your GPUs are. FPGAs have also been around a while, but recent market actions are making them a more viable option: Intel’s acquisition of Altera, the maturation of the OpenCL toolchain, Microsoft’s use of FPGAs to accelerate Bing in its data centers, and AWS adding FPGAs to its cloud offerings. The faster CPUs run, the hotter they get, so if you are going to achieve speed with CPUs, you are going to need a really good cooling method, and that will be liquid cooling.
Oh, and then there’s the plethora of articles on FPGAs and liquid cooling. The media can’t stop talking about them! So – FPGAs and liquid cooling. I just don’t think they are going away.
Just a few of the articles…
Custom Liquid Cooling to Support Canada’s Largest Radio Telescope
Aquila and TAS Energy Launch Liquid Cooled Edge Data Center Solution
Asetek Liquid Cooling Delivers Savings and Flexibility for HPC
Asetek Sports Eight Installations on the TOP500 Supercomputer List
CoolIT Systems and STULZ Debut High Density Chip-to-Atmosphere Data Center Liquid Cooling
Green Revolution Cooling Helps Tsubame-KFC Supercomputer Top the Green500
Nimbix & Xilinx Accelerate FPGA-Based Workloads in the Cloud
One Stop Systems HDCA Supports 16 Nallatech 510T Accelerator Cards
3. Women in HPC.
I am proud and honored to be part of this organization. The schedule was packed with BoFs, panel discussions, workshops, a networking reception, and a meeting of the Advisory Board. Toni Collis, the founder of WHPC, won an award for outstanding leadership, and WHPC won an award for diversity.
I ran a workshop panel discussion with Sudip Dosanjh from NERSC, Kelly Gaither from TACC, Debra Goldfarb from Intel, and Kathryn McKinley from Microsoft. The goal of the panel was for each panelist to share what their organization is doing to improve diversity and inclusivity. The comments from the panelists and the questions from the audience fell firmly into two areas: managers who want to improve diversity, and individual contributors who seek advice when they don’t feel their organization is taking diversity seriously.
I had a couple of takeaways from my work with WHPC:
• We, as WHPC, need to become a more diverse organization. This is not just for women!
• There are those who want and seek advice. Then there are those who don’t think diversity or inclusivity is an issue. I’m focusing on those who want to learn, and will continue to present the facts to try to open the minds of those who don’t think there is an issue.
• As the marketing lead and one of the strategic advisors for WHPC, I will be doing a messaging exercise. We need to make sure our message is crisp, inclusive, and demonstrates what we are really trying to accomplish.
4. SC16 Committees.
I worked with the Diversity Committee, the Student Cluster Competition, and the overall SC16 conference, helping with communications and social media. My key takeaway: it’s a lot of work, but it’s rewarding to be part of what makes the conference tick. I also got to meet a whole other group of people you don’t see when you spend all your time on the show floor – the researchers, the supercomputing center people – basically, the people who are combining HPC and science. Simply fascinating.
And I got one of these. 🙂
5. SC16 Show.
I am embarrassed to admit it, but until this year I had never been to a plenary or a keynote. I went this year as part of my SC16 Twitter role (active tweeting…), and also attended the award ceremonies. Simply – the plenary and the keynote were AWESOME! If you didn’t go, watch the Plenary and Keynote. And go to those sessions next year! THIS is where HPC is being applied to science.
6. SC16 First Time Attendees.
The SC conference brought back the First Time Attendees session. The room was packed! Bernd Mohr (SC17 chair), Dustin Leverman from ORNL, and I shared information on the conference and how to make the most of SC16. I suspect we’ll do this again for SC17.
7. SC16 Conference Numbers.
11,100 registered attendees descended upon Salt Lake City. The technical program spanned six days. The exhibition was the largest in the history of the conference, with 349 exhibitors from industry, academia, and research on the exhibition floor. SCinet ran 56 miles of fiber and delivered 3.15 terabits per second of bandwidth. This is One Big Show.
insideHPC did a nice recap. Read it to get all the details!
I did my fair share of networking, as I always do. Too many parties for any one human to actually attend. I hit a couple of them and did some good networking in the lobby of the Marriott at the convention center. My favorites: Intel’s party at Rice-Eccles Stadium & Tower (that is some really green grass / grass alternative), the Goodnight Cluster Beowulf Bash (Meh!), Nimbix’s lounge party (with seriously the best food / beverage combination of any event), WHPC’s networking event, the SGI party (The Spazmatics!), Mellanox with their always great entertainment and food, and the Data Vortex event with their Golden Ticket. I know I missed many other events (the SC16 Tech Reception Thursday night, for one), but there is only so much time in a day.
There is just never enough time: to catch up with the people I want to see, to meet new friends, to go to the sessions, and to visit the booths. If I missed you (and you know who you are!), then I guess we’ll have to catch up the old-fashioned way, via email or text. Just kidding – we can actually talk via voice.
10. Post SC.
Yes, I went home tired. I closed my office to give my team a chance to rest after the frenzy of work leading up to the conference. Personally, I took a stack of books and shorts, and found a lounge chair in a warm location where I read, and when I got up, I did some hiking. It was the perfect way to get recharged for the post-SC work.
Next up for me: some HPC Advisory Council events, ISC17 (and another Student Cluster Competition!), maybe an HPCast, HPC on Wall Street, some SC17 meetings, hopefully PRACE Days, and some related industry events like IDC, Gartner, OpenStack Summit, DCD, or Strata + Hadoop World. Suggestions welcome!
This blog was published by InsideHPC as a special guest feature.
About the author: Kim McMahon has performed sales and marketing for more years than she cares to count. She writes frequently on marketing, life, the world and how they sometimes all come together.