Computing and the people who work with computers are not the nerdy, negative images so often portrayed in the media. As a computer scientist, educator, and project evaluator with my hands and feet in many fields, I live these realities every day. I am like the kid who never stops asking "why?" In this blog, I share my questions and curiosity about the interdisciplinary role of computing, with a special concern for how computing can make the world a better place.
Sunday, April 28, 2013
"Predictions are Hard, Especially About the Future"
I was listening to a talk recently when this was said by someone with unquestionable technical expertise. Given that I know what he meant to say, as opposed to what he actually said, I will protect my source. The point, of course, is that you can guess, you can calculate, you can run all the statistical models in the world, and you might be wrong anyway. After all, the future is the result of lots of moments of now: my now, your now, other people's now, and the now that comes after the next now. The mere fact that I give you a clue about a possible direction might change your behavior, and hence that direction.
The stuff science fiction is made of. Did you know that it is not at all science fiction that DARPA is funding a design project to see if we can develop a propulsion system to get us to Alpha Centauri in 100 years instead of the current 65,000 years? (Perhaps the goal is 1,000 years; I may have misplaced a 0.) Think about the development of a backbone array of receivers stretched across the galaxy. A different set of interplanetary protocols is needed. TCP/IP wasn't designed to store data (why should it have been?), so when there are delays... whoops, lost a packet. In trying to beam Dr. McCoy from Planet Earth to Alpha Centauri, we seem to have misplaced a few of his body parts. Just as Bones predicted would happen. I gather, then, that the idea is to develop interplanetary protocols with a store-and-forward capability that will withstand episodic disruptions and delays.
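For the technically curious, here is a minimal sketch in Python of that store-and-forward idea, the concept behind delay-tolerant networking. Every name here (Bundle, RelayNode, the timeout values) is my own illustration, not the real Bundle Protocol API: a message is held in custody during a link outage and forwarded later rather than dropped.

```python
# A toy sketch of store-and-forward: hold messages while the next hop
# is unreachable, forward them when the link returns. Illustrative only.
import time
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Bundle:
    """A self-contained message that can wait out long link outages."""
    payload: bytes
    created_at: float = field(default_factory=time.time)
    ttl_seconds: float = 3600.0  # discard only after a generous lifetime

    def expired(self) -> bool:
        return time.time() - self.created_at > self.ttl_seconds

class RelayNode:
    """Stores bundles during an outage; forwards them when possible."""
    def __init__(self) -> None:
        self.queue: deque[Bundle] = deque()

    def receive(self, bundle: Bundle) -> None:
        self.queue.append(bundle)  # take custody: hold it, don't drop it

    def flush(self, link_up: bool, send) -> None:
        """Attempt delivery; on outage, simply keep waiting."""
        while link_up and self.queue:
            bundle = self.queue.popleft()
            if not bundle.expired():
                send(bundle)

# Usage: queue a message during an outage, deliver when the link returns.
node = RelayNode()
node.receive(Bundle(b"Hello, Alpha Centauri"))
node.flush(link_up=False, send=print)                 # nothing sent; bundle retained
node.flush(link_up=True, send=lambda b: print(b.payload))
```

Contrast that with plain TCP/IP, where a connection that stays dark for hours (let alone the years a deep-space hop would take) simply times out and the data is gone.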
Not so far in the realm of the future - in fact, in the here and now - is an internet-enabled surfboard. And it has been around for a decade, at least in prototype, because I can't find anyone, including Intel, which created it, selling one (see one news article here). I confess to having mixed feelings about a wifi-ready surfboard anyway. Surfboards are a dime a dozen here in Southern California. On any given day, unless the ocean is flat as a pancake, you are likely to find the dedicated out on the waves. More often than not, they are just sitting there, hoping for that perfect swell. Assuming proper conditions, a good time to find them is between 7:00 am and 8:00 am on any given weekday. At 8:00 am they pile out of the water, strip off the rubbery outfit, and put on their suits over their sandy selves. Off to work. Last week, I was driving down the freeway at about this time when a bright green surfboard came bouncing along - it must have come loose from someone's car as they boogied to the office at 80 mph.
Speaking of high speed, I wonder what data rates you can get on a surfboard. Is a surfer more likely to miss the perfect wave while busily updating their status on Facebook? Personally, I prefer to leave all that behind when I don the rubbery outfit. Maybe others do too. Maybe that's why I don't find the internet surfboard for sale even in my local high-end surf shop. (Excepting perhaps Los Angeles and vicinity, I can't think of any place in the continental US where a wifi surfboard would be more likely to make an appearance.)
Then there is IPv6. You techie nerds out there have heard of it, right? It had its world launch in 2012, yet to date penetration is at about 1%. I agree with Vint Cerf (see that link two sentences back): we should stop making excuses and implement it.
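If you want to check whether your own machine and network actually speak IPv6, a few lines of standard-library Python will tell you. The hostname below is just an example target; swap in any host you like.

```python
# Probe for a working IPv6 path by opening an AF_INET6 TCP connection.
import socket

def has_ipv6_path(host: str = "example.com", port: int = 80) -> bool:
    """Return True if an IPv6 connection to host:port succeeds."""
    try:
        # Ask only for IPv6 addresses; raises OSError if none exist.
        infos = socket.getaddrinfo(host, port, socket.AF_INET6,
                                   socket.SOCK_STREAM)
        family, socktype, proto, _, sockaddr = infos[0]
        with socket.socket(family, socktype, proto) as s:
            s.settimeout(5.0)
            s.connect(sockaddr)
        return True
    except OSError:
        return False

print("IPv6 reachable:", has_ipv6_path())
```

If that prints False while your provider claims IPv6 support, you have found one small piece of that missing 99%.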
It is super cool that as of last fall, non-Latin domain names were approved. Cyrillic, anyone? Speaking of the future: at the time I located that article, it was dated tomorrow (i.e., one day ahead of the day I am functioning in). Of course this has to do with time zones in Europe vs. San Diego, but it seems very much in the spirit of things that an article about a futuristic tech topic, cited in a post about the future of technology, was itself posted in the future. For just a blip in time, I can see the future.
Have a wonderful now.
Thursday, March 17, 2011
Computing has an Important Role to Play in Earthquake Preparedness and Response
When devastation as large as that currently happening in Japan occurs, it can be hard to know what to say or do. If you are like me, you have been reading the news daily (or more often), caught up in a mix of complicated reactions. This morning, for example, I watched computer-generated weather simulations of possible flow patterns of radioactive contamination (via the BBC). As the simulation looped over and over, I couldn't help but be transfixed by one large multicolored plume as it slid like a mutant amoeba over Southern California. Right here, in other words. The colors registered different levels of radiation. Computers generated those simulations, and unsettling as they were, I'm glad to be able to see them. It is better to have knowledge from a reliable source than no knowledge, even when that knowledge is based on probabilities and a great deal of the unknown.
I was very grateful for computer science when the recent earthquake struck New Zealand. A friend lives in Christchurch, and it was only a matter of a few nerve-wracking days before a brief post appeared on Facebook telling all of us that she was OK - no doubt considerably freaked out, but OK. Thank you to the computer scientist creators of social networking.
The situation was very different in 2004 when the tsunami hit Sri Lanka and someone I know was very near the coast. It was over a week before we learned that he and his family were alive. There was no email access, no smart phones, no Facebook page, nothing but waiting and telephone calls to the US State Department (who were terrific by the way). The computing communication infrastructure either was not there to start with or had been completely disabled by the dual natural disasters of earthquake and tsunami.
Watching the developing situation in Japan - the triple disaster unfolding as nuclear contamination possibilities are added to the realities of earthquake and tsunami - and watching the weather models, I was reminded of the researchers around the world who work full time developing earthquake simulation models and performing seismic hazard analysis. They work on these models so that we can know as much as possible about what can happen, how it can happen, how we can best prepare, where to erect buildings and other structures, and how to protect them as best we can.
Developing 3-D and 4-D maps and models is a classic computing problem of large-scale data analysis: selecting and applying the "best" constraints, knowing that the model you develop will depend upon choices about the possible epicenter (the location on the earth's surface directly above the underground origin of the earthquake), focal depth (how deep that origin is), magnitude (the amount of energy released), and the possible paths the seismic waves may follow. There are innumerable factors to include or leave out of this type of model: local and regional variations, ground type, land masses, rock type... just for starters. It is all about improving probabilities and predictions.
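To make those ingredients concrete, here is a toy sketch in Python. The dataclass fields mirror the choices above (epicenter, focal depth, magnitude); the attenuation formula and its constants are invented purely for illustration and are not a published ground-motion prediction equation.

```python
# A toy sketch of the inputs a seismic hazard model juggles. The formula
# below is a made-up illustration: shaking grows with magnitude, decays
# with distance from the rupture, and is scaled by local ground type.
import math
from dataclasses import dataclass

@dataclass
class EarthquakeScenario:
    epicenter: tuple[float, float]  # (latitude, longitude)
    focal_depth_km: float           # how deep the rupture origin is
    magnitude: float                # energy released (moment magnitude)

def toy_ground_motion(quake: EarthquakeScenario,
                      site_distance_km: float,
                      site_amplification: float = 1.0) -> float:
    """Crude relative shaking estimate at a site; constants illustrative."""
    # Slant distance from the site to the buried rupture origin.
    rupture_distance = math.hypot(site_distance_km, quake.focal_depth_km)
    # Energy scales roughly exponentially with magnitude; shaking falls
    # off with distance; soft soil (amplification > 1) makes it worse.
    return (site_amplification * 10 ** (0.5 * quake.magnitude)
            / (rupture_distance ** 1.5 + 1.0))

quake = EarthquakeScenario(epicenter=(34.05, -118.25),
                           focal_depth_km=10.0, magnitude=7.0)
for d in (10, 50, 100):  # sites at increasing distance, same soft soil
    print(d, "km:", round(toy_ground_motion(quake, d, 1.4), 2))
```

A real model replaces each of those one-line guesses with physics, measured geology, and enormous quantities of data, which is exactly why the hardware matters.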
The paths of seismic waves are not always what one might expect. For example, one reason Los Angeles gets hit so hard by some earthquakes on the famous San Andreas fault is that a natural "funnel" directs ground motion directly into the city from a section of the fault well east of the city. Complex modeling and a solid knowledge of the land revealed this important information.
You have to know your hardware, firmware, and software; you have to know how to work with the latest and most sophisticated high performance computing systems. You need operating system, algorithm, and programming language optimization; databases to hold all those data; and sophisticated networks to link the distributed grids of computers.
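As a small taste of that last point, here is how one might farm independent scenario runs out to worker processes using nothing but the Python standard library. The run_scenario function is a hypothetical stand-in for a real (and far more expensive) wave-propagation kernel; real hazard studies run sweeps like this across whole clusters.

```python
# Hazard models sweep huge parameter grids; the runs are independent,
# so they parallelize naturally. A minimal single-machine sketch.
from multiprocessing import Pool

def run_scenario(params: tuple[float, float]) -> float:
    """Placeholder for an expensive wave-propagation simulation."""
    magnitude, depth_km = params
    return 10 ** (0.5 * magnitude) / (depth_km + 1.0)

if __name__ == "__main__":
    # A small magnitude x depth grid; real sweeps have millions of cells.
    grid = [(m, d) for m in (6.0, 6.5, 7.0, 7.5)
                   for d in (5.0, 10.0, 20.0)]
    with Pool() as pool:  # one worker per CPU core by default
        results = pool.map(run_scenario, grid)
    for params, shaking in zip(grid, results):
        print(params, "->", round(shaking, 2))
```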
If you have an interest in earth science and scientific computing (you don't need expertise in both - this is where collaboration between fields comes in), then here is an area where you can work to make a difference in people's lives.
Labels: Earth Science, environment, high performance computing, interdisciplinary computing, modeling and simulation, scientific computing