Monday, November 30, 2009

Homework 7

1.
2.

Homework 11

1. Trend analysis is very relevant to my project of data markup and visualization. With data markup, certain trends in the data usually need to be observed in order to mark it up and later process it effectively. In data visualization, the goal is to display data so that the audience interprets it well, and interprets it the way it is meant to be interpreted. One would want a person observing a visualized data set not only to comprehend the data but to comprehend it in a manner where he is not misguided or deceived by the figures, e.g. the two graphs from the class session on Trend Analysis.

2. Data markup and visualization are very applicable to weather prediction. With data visualization, organizations and governments already let people see maps of past, present, or future air pressure, temperature, air currents, cloud formations, etc., and visualization will only continue to help communicate weather predictions to the public. Data markup is also already prevalent in the weather field. From http://xml.coverpages.org/weatherML.html: "WeatherML (Weather Mark-up Language) will be the global standard protocol for weather derivatives deal description data. It will enable efficient electronic processing of weather trades, allowing compatibility between systems, reducing trading costs and operational risks. WeatherML will provide increased credibility for the weather derivatives market by signaling to observers that the market is mature enough for standards to emerge." For the public, people could also grab specified future weather headlines in an automated fashion, so that the weather would come to them without their having to look for it specifically.

Homework 10

1. Nothing in science can be proven, because science is just an attempt to understand unexplained phenomena which can only be observed by humans in a limited number of ways. E.g. something can be observed 10^1000 times in a controlled setting and occur exactly the same each time, but on observation 10^1000 + 1 it could go differently and never recur for quite some time. Man just forms patterns from what he sees, and if an event goes the same way for extended periods of time, he will think that it will continue to go that way and act accordingly. But not all the variables are controlled, leaving room for error or some unforeseen event; nor are all the variables even known to man, leaving still more room for error. Only in things contrived by man can there be absolute proof or axiom, for there all the bounds are set by man and nothing hidden from outside can cause a mischance. In science, though, there can be no proof.

2.

Homework 12

1. If I had a robot that could perform only one function, I would have it explain to me in human terms the concepts of human self and self-consciousness, so that humans could finally resolve many philosophical and metaphysical issues and maybe create an alternate consciousness for computers, one which would surpass that of a human and be a greater intelligence.

Homework 13

1. Toxoplasmosis may appear not to be a threat because of its subtle symptoms, but I think it does need to be addressed, given how widespread it is and how little awareness people have of it. To me, having 12.3% or more of the U.S. infected with toxoplasmosis while there is essentially no news coverage of it seems worrying. It may be that people are trying to cover it up to avoid panic, or that the disease is being used to alter people's personalities for unknown reasons. Or maybe people just aren't taking it seriously because of the inconclusive tests in humans with toxoplasmosis. But if no steps are taken to prevent and eradicate toxoplasmosis, it will only continue to spread and evolve, causing unforeseen consequences which may be dire. What if its mood-altering properties in male humans kept progressing and eventually led to ruin? Society may not function well with males being twice as irritable and angry, less capable with motor skills, etc. Who knows what the future holds with toxoplasmosis, but I don't think a parasite would have any beneficial effect on the future of humanity, and certainly not if humans take no notice of it.

Homework 14

1.

http://www.youtube.com/watch?v=Ir8U22_tzu8

http://www.youtube.com/watch?v=OL58ygztOzs&NR=1


The meteor showers in the videos are just a few visible meteors going by every so often.

I think this relates to Earth and the future colonization of other planets by Earth's inhabitants in three areas: transportation, safety, and material needs. On transportation, meteors could pose a problem to future space vehicles by collision or path blockage. On safety, meteors could collide with and breach colonies, or the transportation areas and devices, on colonized planets. Concerning material needs, meteors could perhaps be gathered and used, eliminating some of the need to travel back to Earth or other planets to retrieve materials. Meteors could even benefit transportation if their momentum could be harnessed to assist in travel to far-off places. On the whole, though, meteors seem to weigh against colonization of other planets, because they move at relatively high speeds and are difficult to control and utilize by today's means.

Homework 8

1.
  • Could this really happen? I don't think this whole scenario could happen, because mutated humans couldn't, I think, have the computing capacity to calculate the future. However, many things in the movie could happen.
  • What could really happen? One example would be the transit system: automatic cars regulated so that humans cannot make errors with them, streamlined and efficient. There is also a lot of physical interaction between humans and computers (e.g. the main character going through the video clips with hand gestures), rather than simply the mouse and keyboard traditionally used today. There is a large possibility that technologies such as SixthSense will be highly developed and widely used by future society.
  • What if they found genes that make it more likely for someone to commit a crime - what should they do? Someone found to be more likely to commit a crime should be placed under surveillance, the amount of surveillance in accordance with the likelihood of his committing a crime. Counseling and psychiatric evaluation could also be used to gauge whether someone is actually inclined to commit a crime.
  • What is the movie "really" about? I think the movie is two-sided. On the one hand it is made out to be a visually pleasing action film. On the other, it touches on the moral issue of predestination and whether humans have the right to act on and change - or try to change - the future.
  • What themes found in other movies and stories or in the popular imagination are present? One theme popular today is that the future will be pristine, clean, and basically a utopia. Everything is free of dirt and waste in the movie, and crime and disorder seem to be well contained.
  • What methods and tricks does the movie use to make the audience like it? The film is shot so that action is on screen for a good portion of the running time; it's not a film that leaves the audience steeped in dialogue or hanging in an emotional scene. It also uses various film techniques and modern-day technologies to keep the audience interested in the plot as well as in the futuristic society and technologies presented.
2. Already presented

Wednesday, October 21, 2009

Homework 9

1. I am developing a program to visually represent data, so that users can easily access data sets and find relationships among their content. The program will take XML marked-up documents and create a map of the elements, with the relationship between elements shown by how closely they are placed to one another. The relationship level will be based on a definition set generated by various users.
*I'm still working on generating the definition set. Maybe I'll ask on a forum for people's thoughts on the tags: what they mean to them? Or I could artificially generate a set so that it encompasses a broad range of expected queries.
*The logic behind the proximity of the elements and the definition:element linking is still being worked out.
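As a rough illustration of the element-map idea, here is a minimal sketch using Python's standard xml.etree library. The closeness score below (based only on tree depth) is a stand-in of my own invention; the real relationship level would come from the user-generated definition set, which is still being designed:

```python
import xml.etree.ElementTree as ET
from itertools import combinations

# Hypothetical sample document; any XML marked-up data would do.
DOC = """
<library>
  <book><title>A</title><author>X</author></book>
  <book><title>B</title><author>Y</author></book>
</library>
"""

def element_depths(root):
    """Map each element tag to the depths at which it occurs in the tree."""
    depths = {}
    def walk(el, depth):
        depths.setdefault(el.tag, []).append(depth)
        for child in el:
            walk(child, depth + 1)
    walk(root, 0)
    return depths

def tag_closeness(root):
    """Score every tag pair: 1 / (1 + difference in average depth).

    This is only a placeholder for the definition-based relationship level;
    a visualization layer would place high-scoring pairs nearer each other.
    """
    depths = element_depths(root)
    avg = {tag: sum(d) / len(d) for tag, d in depths.items()}
    return {
        (a, b): 1.0 / (1.0 + abs(avg[a] - avg[b]))
        for a, b in combinations(sorted(avg), 2)
    }

root = ET.fromstring(DOC)
scores = tag_closeness(root)
```

A layout engine (e.g. the force-directed layouts in Prefuse/flare, listed below) could then use these pair scores as edge weights when drawing the map.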

2. I couldn't pin a new idea for my project to Minority Report, so I just thought of a new idea on its own: documents with identical content but different markup could be matched against each other to find relationships.
*This is not implemented yet.

3. One thing I found interesting in Ghost in the Shell is the overall infrastructure of the movie's world, which allows people to effectively communicate, transport, and integrate data. This effective use of data and communication is even the premise for the story's antagonist, the Puppet Master, an intelligence that developed in the sea of information.
*The overall project is designed for the better organization of data, for easier access and use by users. I'm not sure of a new part I could incorporate that would be feasible given my inexperience, though one possibility is an accumulation of the definition:element links and the relationships between elements given in [1]**. Through this combination of data sets, more relationships can be drawn and more easily customized for users, given the great amount of information before them.
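The accumulation of relationship data mentioned above could start as something as simple as merging scored tag-pair maps contributed by different users. This is a hypothetical sketch; the (tag, tag) → score shape and the averaging rule are my assumptions, not a settled design:

```python
def merge_relationships(set_a, set_b):
    """Combine two {(tag, tag): score} maps: average scores for pairs
    present in both, and keep pairs unique to either map as-is."""
    merged = dict(set_a)
    for pair, score in set_b.items():
        merged[pair] = (merged[pair] + score) / 2 if pair in merged else score
    return merged

# Two small example relationship sets from different (imagined) users.
a = {("book", "title"): 0.9, ("book", "price"): 0.4}
b = {("book", "title"): 0.7, ("book", "isbn"): 0.6}
combined = merge_relationships(a, b)  # shared pair averaged, others kept
```

A real accumulation step would likely need weighting by how many users contributed each pair, but the merge above captures the basic combination of data sets.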


Similar software/technologies:
Prefuse/flare: http://flare.prefuse.org/
RDF: Resource Description Framework : http://www.w3.org/RDF/
IBM Many Eyes: http://manyeyes.alphaworks.ibm.com/manyeyes/

*Denotes explanations for the answers
**Refers to number 1 of the homework

Monday, September 28, 2009

Homework 6

1.
a.
  • When will cars be fully automated? http://www.dailymail.co.uk/news/article-393401/The-self-driving-Golf-Herbie-run-money.html
  • Green energy technologies. http://www.greenchipstocks.com/articles/jatropha-biofuel/450
  • Future of markup. http://xml.coverpages.org/coombs.html
b.
  • The car has great precision and crash avoidance, including "sat-nav, collision avoidance sensors and anti-lock brakes." It can even go up to 150 mph, though that seems irrelevant.
  • The jatropha plant can grow in arid conditions and spreads rapidly because it is a weed. It is also inedible, so it does not compete with food crops. Perhaps it's a better alternative to petroleum than corn ethanol.
  • Describes markup use for scholarly purposes, arguing against procedural markup -- the old way -- and for descriptive markup -- i.e. the XML-like way.

2.
a. Markup language could branch out into being a descriptive language not only for readable text data but also for images, audio files, etc. E.g. all the faces in various pictures could be selected and given the tag "face." Then one could search "face" on a picture and have the portions of the picture which contain faces be highlighted.
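A minimal sketch of how that could look: the region markup below is an invented format (not a real standard), and the query simply collects the bounding boxes that a viewer could then highlight:

```python
import xml.etree.ElementTree as ET

# Hypothetical image markup: regions of a picture labeled with tags.
IMAGE_XML = """
<image src="party.jpg" width="640" height="480">
  <region tag="face" x="102" y="40" w="60" h="60"/>
  <region tag="face" x="310" y="55" w="58" h="62"/>
  <region tag="dog"  x="200" y="300" w="120" h="90"/>
</image>
"""

def find_regions(xml_text, tag):
    """Return (x, y, w, h) bounding boxes of every region labeled `tag`."""
    root = ET.fromstring(xml_text)
    return [
        tuple(int(r.get(k)) for k in ("x", "y", "w", "h"))
        for r in root.findall("region")
        if r.get("tag") == tag
    ]

faces = find_regions(IMAGE_XML, "face")  # boxes a viewer could highlight
```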

b.
  • Local quality: change an object's structure from uniform to non-uniform; change an external environment (or external influence) from uniform to non-uniform. (Instead of procedural markup, use descriptive markup with tags, allowing for better interpretation of the data from the author's perspective.)
  • The other way around. (Instead of searching for a particular item in a data set, make inferences by looking at what items you are presented with and use those.)
  • Partial or excessive actions. (Be verbose; don't leave much ambiguity to the end-user.)
  • Feedback. (Use user queries on similar data sets to determine if different data can be reconciled, reducing confusion in finding different but equivalent data.)
  • Cheap short-living objects. (For data that is produced and sent relatively quickly and will be integrated into a larger series, mark it up quickly, for the tags will be replaced once in the larger set.)
  • Discarding and recovery. (Retain markup data even after it has been processed, for reference or backup.)
  • Merging. (Combine marked-up data with visual representation through logical relevancy, for a quick overview or understanding of the data.)

Saturday, September 19, 2009

Homework 5

1. I am using Intrade with the username Phantasyfin.


2.


  1. A federal government run health insurance plan to be approved before midnight ET 31 Dec 2009

  2. Average Global Temperature for 2009 to be among five warmest years on record

  3. US Economy in Recession (*see contract rules for definition*)

  4. The US Economy will go into Recession during 2009

  5. Microsoft Windows 7 to be released on/before 31 Dec 2009

  6. United States to conduct overt military action against North Korea on/before 31 Mar 2010

  7. USA agrees before end of 2009 to reduce CO2 emissions by 10% or more by year 2025

  8. Jennifer's Body to gross OVER $5.0M in opening weekend

  9. A cap and trade system for emissions trading to be established before midnight ET on 31 Dec 2011

  10. Osama Bin Laden to be captured/neutralised by 31 Mar 2010

  11. Venue in North America to host the 2016 Summer Olympics



3.


  1. Too high

  2. Too low

  3. Too low

  4. Too low

  5. Too low

  6. Too high

  7. Too low

  8. Too low

  9. Too high

  10. Too low



4.

     #shares   price per share ($)   total per market ($)
a.      0            0                       0
b.     20            5.45                  109.00
c.     30            9.68                  290.40
d.     15            9.80                  147.00
e.     65            0.90                   58.50
f.      0            0                       0
g.     70            1.20                   84.00
h.     10            2.50                   25.00
i.      0            0                       0
j.     52            5.49                  285.48




Total: $999.38




The letters a through j correspond to the listed market predictions in number 2 of the homework.

Monday, September 14, 2009

Homework 4

1. My original question was "When will a standardized markup language be implemented for data?". This was very ambiguous and was interpreted very differently from what I had envisioned. One way it could be read: "So all data is being marked up and a standard set of rules is in place for this data. It seems highly rigid and inflexible, something difficult to implement and too cumbersome to be efficient or useful in forwarding human knowledge." I would reword my question to specify that not all data, not every single string of data being made, would be marked up. There would also be a set of rules to follow, but these would allow the marker to work within them and create his own tags and nesting if he so chooses. Basically, the standard would encompass pertinent information, that which would be more useful to end-users if it could be easily accessed and searched through by means of an intuitive index. It seems very interesting to me that a document spanning hundreds of pages, or maybe a hundred documents 1-2 pages long, could be programmatically filtered to output desired information, rather than a user having to wade through familiar, irrelevant, or "fluff" information.
I can't reword my question, though, without assuming that the receiver of the question has at least some common sense. E.g. one person commented "but stuff written on napkins won't be marked up" - that's just silly; of course things written on napkins won't be marked up. Only relevant information, that which is important enough to warrant examination or review, should be marked up, or else the process would be counterproductive: marking up trifles would take more time than it would save in the whole scheme, in my opinion at least. Only when computer processing and data accumulation reach much greater power, and the need arises for much greater data accumulation and processing, would marking up all information be practical.
I hold that explicit is always better than implicit. If everything can be analyzed logically, then informational ties and unity can be drawn more easily and efficiently. If there is room for error, some backtracking may occur, or no amount of backtracking may solve the problem, leading to human intervention to test the ambiguity: ambiguity resulting only from implicit human thought. However, I think that by the time computers are able to mark up and systematically analyze trifles, they should also be able to think rationally, as a human does. If a computer has perfect unity and flow of information, it will only accumulate more, continuing to grow and store empirical data.

2. For my project I would obtain a series of unmarked information. I would proceed to mark it up in XML and then parse out what information I desired. I could compare the time it takes to search the documents by hand with the time it takes to parse them. I would not include the time it takes to program and mark up the information in my stand-alone case; however, I will try to find out whether this would increase the efficiency of getting the information one needs from the documents. My theory is that it definitely would not be more efficient to program this if I only had a series spanning a few pages and I was the only one viewing the documents. However, I think that the more people who need information from the documents, coupled with the increasing size of the information load, the more efficiency would increase, reducing time for each of the participants. The next step is to search for documents that would suit my parsing plan.
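The timing comparison could be set up along these lines. The corpus here is synthetic and the plain-text record format is invented, but it shows the measurement I have in mind, with the one-time parse/markup cost deliberately excluded per the plan above:

```python
import time
import xml.etree.ElementTree as ET

N = 20000

# Synthetic corpus: the same records as plain text and as marked-up XML.
PLAIN = "\n".join(
    f"Record {i}: author=Person{i % 50}; topic=Topic{i % 7}" for i in range(N)
)
XML = "<docs>" + "".join(
    f"<doc><author>Person{i % 50}</author><topic>Topic{i % 7}</topic></doc>"
    for i in range(N)
) + "</docs>"

# Plain search: scan every line for the topic string.
t0 = time.perf_counter()
hits_plain = [line for line in PLAIN.splitlines() if "topic=Topic3" in line]
t_plain = time.perf_counter() - t0

# Marked-up search: parse once (excluded from the timing), then query the tree.
root = ET.fromstring(XML)
t0 = time.perf_counter()
hits_xml = [d for d in root.findall("doc") if d.findtext("topic") == "Topic3"]
t_xml = time.perf_counter() - t0

# Both approaches must find the same records for the comparison to be fair.
assert len(hits_plain) == len(hits_xml)
```

Repeating the queries many times against the already-parsed tree, as many users would, is where the markup approach should amortize its setup cost.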

Tuesday, September 08, 2009

Homework 3



1. Here are two graphs of the question results. -1 represents "Never".


2. One key difference in the Delphi method is that the responses and comments are completely anonymous. In class, we could see quite clearly who had given each response and how he communicated it. Also, in the Delphi method, members of the group can revise their previous statements at any time.


3. One key weakness I find is that the topics chosen to forecast are so diverse among the people in the class. It is unlikely everyone will be learned in what everyone else is interested in. There's also a bandwagon effect from the lack of anonymity: e.g. a person viewed as more respectable may influence a vote in his favor, while someone in disfavor could push others away from his vote. Anonymity could be achieved with something such as a chat room in which each entering member is given a randomly assigned username. Ignorance of topics could also be somewhat eliminated if the class reaches a consensus on generally-known topics.





Tuesday, September 01, 2009

Homework 2

1. I will make a prediction about the future of the HIV/AIDS virus. Currently HIV is growing at an exponential rate, increasing as the population of the world increases. It is also predicted by the UN to continue to grow even into 2025, as pictured:

However, it seems that the rate of growth is diminishing and will eventually level out. Also, the number of people without HIV is rising while the number with HIV is lessening in comparison to it. This leads to the conclusion that the rate of people contracting HIV will eventually even out, creating an S-curve. After the rate flattens out, it would then possibly go down, creating a plateau curve. I predict a plateau curve because of medical advances and HIV awareness/prevention. Since HIV is prevalent around the world, a global effort has ensued to eradicate it. Maybe man will be able to stop it, but its future is uncertain.


2.
a. It would take about 12 years for productivity to double if the number started at 1440. By the twelfth year productivity would be about 2897.563.
b. The per-year increase is about 41.4214% if productivity doubles every 2 years.
c. The per-year increase is about 58.7401% if it doubles every 18 months.
d. Assuming I started with $1200 in my account, by the end of the 35th year, after interest had been accounted for, I would have doubled my amount to $2400.
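The arithmetic behind these answers can be checked with a short sketch. Note the ~6% annual rate in (a) is my inference from the quoted 2897.563 and the rule of 72 (doubling in ~12 years), not something stated in the assignment:

```python
# Doubling-time arithmetic: rate r per year doubles a quantity in T years
# when (1 + r)^T = 2, i.e. r = 2^(1/T) - 1.

def annual_rate_pct(doubling_years):
    """Per-year growth rate (as a percent) for a given doubling time."""
    return (2 ** (1 / doubling_years) - 1) * 100

rate_2y = annual_rate_pct(2)      # (b) doubling every 2 years, ~41.42%
rate_18m = annual_rate_pct(1.5)   # (c) doubling every 18 months, ~58.74%

# (a) 1440 growing ~6%/year (inferred) roughly doubles in 12 years.
value_12y = 1440 * 1.06 ** 12
```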

Wednesday, August 26, 2009

Homework 1:

1. Prefuse Flare: http://prefuse.org :
I think it's the future of data representation: visual elements that help humans better understand data.

2. HDMI: http://www.hdmi.org/:
Older monitor cords and audio cables can be combined into one to effectively transfer high-quality audio/visual streams.

3. Data-mining in general: http://www.anderson.ucla.edu/faculty/jason.frand/teacher/technologies/palace/sources.htm:
Data mining is basically the extraction of useful information from a large series of data. I think that as newer techniques are developed and less human interaction is needed to sort out the "gray" areas that often arise from a lack of sufficiently advanced data-mining methods, there will be an increase in the efficiency of data distribution, usage, and analysis. Because data could be so easily sorted through and obtained, technologies and businesses could be developed and maintained more efficiently. E.g. someone could eliminate redundancy and the time wasted solving a problem that had most likely already been solved and reported by someone else, but whose solution couldn't be accessed due to a lack of data indexing and widespread data distribution.