Wednesday, March 30, 2016

#BbWorld 2016 Conference - Why you should attend!

Why Attend BbWorld 2016?

Some things are best face-to-face. Like exchanging best practices with colleagues from around the world to drive student success. The best place to do this? BbWorld 2016. Where thousands of your peers, solutions experts and partners come together to exchange information and insights. Face-to-face. This is that once-a-year meeting of the minds that’s worthy of your presence.
Learn more about the 6 ways attending #BbWorld16 can help you improve the learner experience at your institution…

Additional Bb Resources
Join BbWorld on LinkedIn
Network with other attendees and get the latest news on registration discounts and event announcements.
Justify your trip
Attending BbWorld 2016 is an investment in the future of your career - and your institution. Download a customizable letter to share with your supervisor to help secure the approvals you need to join us.
Don't forget
Early bird registration ends April 22!

BbWorld is a great place to get answers to the educational technology questions no one else has been able to help you with. There are thousands of Bb gurus running around who are willing to talk and help everyone. Don't understand "geek speak"? No problem; you can find a peer who has already solved your problem to have lunch, dinner, or a drink with. This conference isn't just about Bb applications: it is about solving education and learning challenges with educational technology, which just happens to include the Bb suite.

Sunday, March 6, 2016

#SIGCSE_TS Day 2 Summary

To start off, we learned that #SIGCSE2016 had 297 papers submitted, of which 105 were selected, a 35% acceptance rate. A paper by eight authors on a multi-institutional study of peer instruction in introductory computing was selected as the best paper. One really nice thing SIGCSE did that most other conferences do not: they gave each of the authors an award, not just one to share. We also learned that the 2017 conference will be held in Seattle, Washington, March 8-11. It was also nice that when the theme Inspire, Innovate, and Improve was announced, they did not just say the words; they explained how each relates to SIGCSE's goals.

The day's activities quickly moved to the plenary session by Jan Cuny from the National Science Foundation on CS education, nicely titled "Catching the Wave". Jan explained that science and engineering enrollments are going up, but computer science enrollments are actually going down overall. Seventy percent of the population is being missed by the computer science profession. We need to always consider the inclusion of women, African Americans, Hispanics, Native Americans, and people with disabilities. As Arne Duncan stated in 2013, "Education is the civil rights issue of our time." When planning summer camps for K12, we must think about engagement, capacity, and continuity. Summer camps in K12 often fail because students don't have the money to pay for them, don't have transportation to and from them, or have other activities, like sports, they are participating in. NSF is currently focusing on high schools and will then go back to middle and elementary schools, because they believe high school has the bigger issues and, being closest to college, is the most important. High school guidance counselors have been found to discourage girls and minorities from going into computer science. The other issue is that K12 does not really understand the difference between CIS, IS, IT, and computer science.

Ms. Cuny stated that enrollment in computer science (CS) courses in high schools is down from 25% to 19%. A high school is five times more likely to have a football team than to offer CS courses. Education work must include bigger underserved populations, and this is the underserved population NSF is focusing on. The AP Computer Science course is generally just a year-long Java programming course, yet it is the fastest-growing AP exam right now. Just teaching Java isn't teaching computer science; this is one of the things that led to the creation of the new AP Computer Science Principles course. The College Board provides the framework, but schools map their own curriculum to it, which causes inconsistency in learning. The other issue is finding high school teachers who are really qualified to teach computer science courses; in some cases, art teachers are being told to teach computer science. This is one reason colleges don't respect and encourage computer science in high schools. One way college professors can help fix this is to encourage some of their CS students to become CS educators in K12. Twenty-two states do not count CS toward the high school diploma. Hopefully, Obama's Computer Science for All will help with this. However, I still wonder whether it really is computer science for all or just programming for all. Let's hope it is CS! While I don't agree with Ms. Cuny's statement that educators cannot tell who will succeed in the field, I do agree with her that people learn computer science by doing. Things we college professors can do: give dual-enrollment general education credit for the AP CS Principles course, help K12 teachers earn CS credentials, start a CSTA chapter, host NCWIT Aspirations in Computing awards, take our students into the high schools, and conduct CS education research.
It is interesting that Jan talked about the surge of enrollment into college CS programs, yet accrediting organizations like ABET do not have anything for two-year programs to be certified the way they do at the four-year level. Other interesting things Jan talked about were that most people in CS programs do not want to do CS as a career, that students want two sections (novice and intermediate) for introductory courses, the lack of multiple pathways into introductory CS courses, and how the University of Illinois has addressed this with its CS+X track.

The vendors were excellent, but I had two disappointments in this area. The first was zyBooks, a vendor I use for one of my classes and one that hasn't delivered the quality expected. They suggested meeting at SIGCSE; I went by the booth and left my cell number for the guy to call, but by the end of the day there was not even an acknowledgement. I guess they were more concerned with getting new customers than with making an existing one happy. The second disappointment was Microsoft Research. They had their micro:bit, which is currently only available in the UK, so a lot of us wanted to see it. However, they were only set up to talk with a couple of people at a time, and they were often not there outside the breaks or during lunch, when we had time to visit. They were unable to say when the micro:bit would be in the USA, just that it was coming. Normally, Microsoft does a better job at technology conferences. After a quick 45-minute visit with exhibitors and a strong cup of coffee, it was off to a special session by Vocareum.

Normally, I avoid exhibitor presentations as they tend to be sales pitches. Thankfully, this presentation was about the logic behind their product and research on how it aids CS student success. Vocareum is a coding learning management system (LMS). They have plagiarism detection, inline feedback, automated grading, integrations with common IDEs like Eclipse, and other basic LMS tools. They support C, C++, Java, Python, R, SPIN, and Octave (languages), MySQL (database), Ruby on Rails (server), client-side HTML/JavaScript/CSS, and big data through Hadoop. Some schools using it now are UCSD, Harvard, USC, Penn State, LaunchCode in St. Louis and Miami, and a South Dakota community college doing C++ programming online. Wondering how their auto-grading works?

  1. The instructor writes bash scripts to be run on submission and on grading; Vocareum simply reports the grade and the student report.
  2. When a student submits, the platform provisions the right compute, creates the environment, and runs the scripts.
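As a rough illustration of that first step, here is a minimal sketch of what a grading hook could look like. Vocareum's real hooks are instructor-written bash scripts with their own conventions; this Python version, with made-up test cases and a hypothetical `student_solution.py`, only shows the general run-compare-score pattern:

```python
import subprocess

# Hypothetical (input, expected output) pairs an instructor might define.
TEST_CASES = [
    ("2 3\n", "5"),
    ("10 -4\n", "6"),
]

def grade(program_cmd):
    """Run each test case against the student program and return a 0-100 score."""
    passed = 0
    for stdin_data, expected in TEST_CASES:
        try:
            result = subprocess.run(
                program_cmd, input=stdin_data, capture_output=True,
                text=True, timeout=5,
            )
            if result.stdout.strip() == expected:
                passed += 1
        except subprocess.TimeoutExpired:
            pass  # a hung submission simply fails that case
    return 100 * passed // len(TEST_CASES)

if __name__ == "__main__":
    # e.g. grade(["python3", "student_solution.py"])
    print(grade(["python3", "student_solution.py"]))
```

The platform's job, per step 2, is just to provision an environment where a script like this can run safely against each submission.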

Vocareum is cloud based, running on the Amazon cloud, with integration into the usual LMSs like Blackboard and Moodle. I hope they are at #BbWorld16. They are working on a Google Sheets integration for a client. Plagiarism detection looks at online sources and prior-year submissions, and it even highlights the similarities between multiple submissions. There is peer review and peer grading. Students can upload work to Vocareum or to Git, and they can work on their own machines or within browsers. The nice thing is that coding submissions, grading, comments, and scores are all in one place but can also be exported to the school's LMS. They have analytics in a format CS instructors understand, with aggregation rules, filters, dashboards, and deeper dives. Faculty can view information at the course level and at the student level. Yes, they do have security built into their system.

I would have loved to hear about the new AP Computer Science Principles course, but that was an additional charge because they were serving lunch, and they did not give the option of just listening. It worked out, though, because my son took a break from kids camp and we got to eat lunch together. I then took him to the exhibit hall so he could experience Google Cardboard. He decided Cardboard beat Google Glass because it had a cooler effect and was something he could see being used in his computer class. After pulling him away from Google, it was back to kids camp for more Scratch animation creation and modification, and I was off to the special session on updating curricular guidelines for associate-degree computer science programs. This was an excellent working session where we learned of ACM's desire to infuse cybersecurity into CS degree programs. We also reviewed the ACM Joint Task Force on Cybersecurity Education, which includes representatives from the major international computing societies: the Association for Computing Machinery (ACM), the IEEE Computer Society (IEEE CS), the Association for Information Systems Special Interest Group on Security (AIS SIGSEC), and the International Federation for Information Processing Technical Committee on Information Security Education (IFIP WG 11.8). The current curricular guidelines are from 2013 and do not include the level of security needed today. An initial group reviewed the Knowledge Units (KU) and Knowledge Areas (KA) within the 2013 guidelines, narrowed the focus, and infused cybersecurity into the expectations. The leaders of the CS task force brought these changes to SIGCSE and polled the session attendees to see if we agreed with the initial findings and to discuss differences. We then reviewed the KAs; the biggest issue the group I was in had was remembering that the focus was two-year CS transfer degree programs.
This was because we had people from community colleges with CIS and applied CS degrees that were similar, who wanted to try to develop one curriculum for all. I was with the sub-group talking about security, and at times it was hard to remember (1) that we were talking about expectations of two-year students and (2) that we were talking about CS majors, not cybersecurity majors. It was an excellent session; I met some great faculty from other schools and got some ideas about how others run their programs and find adjuncts with the required certifications... it seems the SACS accrediting body may have a higher standard (master's +18) than other accrediting bodies. The people leading this group care about CS education and are doing it right; they hope to have a preliminary report out in June.

Next it was off to hear about one school's research on creating, and needing, ethical agreements in information security courses. This paper presentation was informative and a must-read for anyone hosting ethical hacking courses. We all agreed you need to teach someone to hack before you can teach them to properly defend against a hack, but it is impossible for faculty to know which students will take what they learn, use it inappropriately, and get the school into legal trouble. Then I was off to a session titled "Creating Exercises and Engaging With Students," but it turned out to be a sales pitch by Turing Craft.

Sadly, I had to miss the community college reception because I had my son with me, but we enjoyed eating atop the 28-story Bass Pro Pyramid, looking out across the mighty Mississippi River, and trying to pick out Justin Timberlake's house from the observation deck. The glass elevator ride up the tallest freestanding elevator in the U.S. was over so quickly you hardly had time to marvel at the Bass Pro creativity inside. To end the night, we played a couple of games of under-the-sea bowling at one of the restaurants in the pyramid. We both decided neither of us should try to be a pro bowler.

Friday, March 4, 2016

A First Timers Review of #SIGCSE2016 Day 1


Greetings from downtown Memphis! I have been honored to teach computer science classes at a community college this past year. A member of the Association for Computing Machinery (ACM) for more than 10 years, I finally joined my first special interest group (SIG), this one for computer science educators. That allowed me to come to Memphis for the 2016 SIGCSE (pronounced SIG-SEE) Symposium. I was excited to attend, not only to get back to my technical... geeky roots, but to network with fellow computer science educators.

One nice thing SIGCSE offers is a "kids camp," which advertised that older kids would be doing "Scratch, many Computer Science (CS) Unplugged activities, and Google’s CS First activities". This gave me an opportunity to bring my son with me, so he could learn these things, possibly help me design a local CS summer camp, and so we could have our first mother/son trip in a city with history. Well, in the eight hours he was in the camp, they only did approximately two hours of coding with Scratch, with the instruction "try it and let us know if you have questions". While the camp leaders are really nice and he has made some new friends, he was disappointed that there wasn't more computer time. I was surprised: Google was at the conference with their Cardboard virtual reality viewers for the educators, so why not let these middle school kids experience them? Oh well, at least McDonald's is going to help get them into kids' hands by having their kid's meal boxes double as Google Cardboard. Microsoft is also at the conference with their micro:bit, but again, it was not brought to the camp. I know it is about selling the K12 educators on these products instead of exciting students about them. Today the camp has a scavenger hunt planned in the exhibit hall, so maybe they will have time to see these items.

For me, SIGCSE started off with an excellent keynote from John Sweller of the University of New South Wales on "Cognitive Load Theory and Computer Science Education". He discussed working memory's two channels, auditory and visual. This keynote address challenged many things education has been promoting. We also learned about these effects:
  1. Worked-example effect - it is better if beginning CS students don't have to solve problems themselves at first; give them the solution and let them first learn to understand it.
  2. Split-attention effect - we should integrate diagrams and statements to make the connections for learners, not just show them side by side or on different slides.
  3. Redundancy effect - we have to start eliminating redundancy; people learn less if content is all on the slide and then restated in the lecture. If we are going to say it in our lecture, we should not also require students to read it from the slide, because the duplication causes working memory overload. This goes against what most of us were taught about telling students things multiple times to help them remember.
  4. Modality effect - students do better when they hear us than when they read instructions.
  5. Transient information effect - a full page of written code is too long for students to comprehend; instead, show code snippets and then have them put the pieces together. Also, for complex concepts we should use static material instead of video.
  6. Expertise-reversal effect - we should switch students from reading solutions to solving problems only after they have mastered reading solutions.
Then we were shuttled off to the exhibit hall for 45 minutes with the vendors and to watch demos. We also had an opportunity to learn about National Science Foundation (NSF)-supported research through presentations. This was a wonderful way to learn about research I would not have known about otherwise. I learned a lot from the NSF presentation on Catalyzing Computing and Cybersecurity in Community Colleges.

The paper sessions were set up in 25-minute blocks, and the presentations were on data structures, computational thinking, and learning research. The presenters were well prepared and willing to answer endless questions. I learned that Germany has a three-strike rule: if a computer science student fails a CS final exam three times, they can no longer study computer science anywhere in Germany. After hearing someone in the audience joke that after failing in Germany they came to the U.S. to graduate, I wondered whether U.S. higher education is too low in rigor. We also learned that, based on Stanford's own students, the belief that CS enrollment growth was negatively affecting student quality is false. There was also an Open Source Software in Education panel session that was good, but I could only attend a third of it because of conflicts with other important sessions.

I was able to attend the "first timers" luncheon to hear the lifetime award winner Barbara Boucher Owens talk about "Service as Rent" and how, from her father's lessons, we all need to donate our time and talents as service to pay rent for our space on Earth. The meal was excellent, and it was a wonderful networking opportunity. I was amazed by the number of returning conference participants who paid to attend and talk with us first timers. This is something I would like to see Blackboard Inc. add to their #BbWorld16 conference.

The afternoon was filled with discussions on pair programming, CS education research, introductory computer science classes (CS0), and lessons on how to engage and diversify. I was able to attend a panel discussion in which fellow Virginian Greg Kulczycki participated. Check out Ohio State University's CSE 2221 course, Clemson's RESOLVE, Dafny, or Rise4fun to select a tool to engage students' reasoning abilities.

We were then given another 45-minute break with the vendors, more demos, and more NSF projects. After downing caffeine, it was off to learn about big data, teaching with teams, program design, and Scratch. I think the best presentation was on how to use Scratch for competitions; I will be following up on this for my students. Sadly, I missed the Special Session on Cybersecurity Education because it was at the same time. My biggest complaint so far is missing too many sessions, and no one has told me if or where the missed content will be available after the event.

The evening ended with my son and me going to the Bass Pro Pyramid via trolley. We were told they took an old basketball arena and transformed it into this amazing shopping, dining, and entertainment adventure. We started off with alligator and pizza, a quick shopping event, and a fun shooting arcade adventure. The surprise came when the workers advised us to walk back because they weren't sure the bus would return to get us. The streets, however, were not heavily populated or well lit. We survived, but decided that downtown Memphis is not very tourist friendly unless, of course, you are on Beale Street. One suggestion would be for the conference organizers to consider scheduled transport to key locations to help promote evening tourism.

Keep up with all the happenings at SIGCSE via Twitter under the hashtags #SIGCSE2016 and #SIGCSE_TS.

Monday, February 8, 2016

Additional Cloud Options for #Bb Learn Ultra

Two new Software as a Service (SaaS) LMS offerings were announced a few weeks ago (January 21, 2016), only six months after they were promised. The new offerings, Blackboard SaaS Plus and SaaS Advantage, join the already available SaaS Standard.

Here is a description of these new service offerings from the official press release:
The new Plus and Advantage tiers give institutions the ability to scale, customize and configure Blackboard Learn for their specific and individual needs. These tiers of SaaS also offer more support for commercial and custom-built integrations, additional service tiers, and a new Flexible Deployment Option. The Flexible Deployment Option allows schools to have control over when new features and enhancements are deployed to their environment. Customers using Learn in a SaaS environment have access to significant benefits such as zero-downtime upgrades, continuous delivery of regular bug fixes, new services innovation, and increased scalability.
Blackboard also provided a FAQ and a nice table comparing the three SaaS tiers. Unfortunately, as of this post, Blackboard Ultra is only available to institutions that run their Learn environments in a managed hosting or cloud (SaaS) environment. Self-hosting institutions must continue using Blackboard Learn 9.1, a fact that took many by surprise when it was more clearly explained at Blackboard World in July. Many of these institutions had been making concrete plans to upgrade to Ultra in a self-hosted environment. The other disappointing news for "self-hosted" institutions is that the new modern look and new "wow" features are not, and will not be, available within traditional Learn 9.1. While this is in large part because of the technology the 9.1 version is built on, I personally believe it was in some part former CEO Jay Bhatt's attempt to force all Learn clients who like to be early adopters to move to a SaaS solution, because he did something similar at Autodesk. The good news for 9.1 self-hosted sites is that Bb has promised to continue support for the product at this time.

Why do I think some people may have missed this announcement? Because of the big CEO change that happened around the same time. While Dr. William “Bill” Ballhaus has been busy talking with Bb employees and selected clients directly, he has also been interacting with people on the Blackboard Community System. Just to test how responsive he was to clients, I sent him a direct message through the community system. Not only did I get a message back, it came within just a couple of days. I now look forward to hearing Bill talk to the Bb MVPs and seeing whether he will listen more closely to what Bb clients are asking for, or whether he is just there to make Bb easier to sell off.

Saturday, October 31, 2015

Should big data analytics be used in conjunction with opinion surveys in Education?

We live in a world filled with data, and most companies are starting to realize the possibilities of big data analytics. So why are higher education and others still making decisions solely on "client opinion surveys"? Why not at least support client survey results with big data analytics?

Webopedia defines big data analytics as "the process of collecting, organizing and analyzing large sets of data ("big data") to discover patterns and other useful information. Not only will big data analytics help you to understand the information contained within the data, but it will also help identify the data that is most important to the business and future business decisions." According to the SAS Institute Inc., "big data analytics is the process of examining big data to uncover hidden patterns, unknown correlations and other useful information that can be used to make better decisions. With big data analytics, data scientists and others can analyze huge volumes of data that conventional analytics and business intelligence solutions can't touch". According to Margaret Rouse (2012), big data can show true "customer preferences", and one of the goals of using big data is "to help companies make more informed business decisions". Teradata states that when big data is done correctly, "it is the coming together of business and IT to produce results that differentiate, that power you forward and reduce costs. Big Data is less about the size of the data and more about the ability to handle lots of different data types and the application of powerful analytics techniques" (2014). This means "smarter decisions cut costs, improve productivity, enhance customer experience and provide any organization with a competitive advantage" (Teradata).
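As a toy illustration of the "discover patterns" part of these definitions, here is a minimal sketch assuming a hypothetical log of LMS click events (the roles and tool names are made up): aggregate raw events into usage counts, the kind of hard evidence that could sit alongside an opinion survey.

```python
from collections import Counter

# Hypothetical LMS activity log: (role, tool) pairs, one per click event.
events = [
    ("student", "discussion_board"), ("student", "quizzes"),
    ("student", "discussion_board"), ("faculty", "gradebook"),
    ("student", "discussion_board"), ("faculty", "quizzes"),
]

def usage_by_tool(log):
    """Aggregate raw click events into per-tool usage counts,
    the simplest possible form of pattern discovery."""
    return Counter(tool for _, tool in log)

print(usage_by_tool(events).most_common())
```

Real big data pipelines do the same kind of aggregation, just at a scale a single machine cannot handle.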

So why isn't everyone using big data? Rouse (2012) suggests it is because companies have "a lack of internal analytics skills and the high cost of hiring experienced analytics professionals" who know tools like Hadoop, Pig, Spark, MapReduce, Hive, and YARN. ThoughtWorks Inc. points out that companies need to shift their thinking from the data itself to insight and impact, and to try to address unanswered questions. Schmarzo acknowledges that educational institutions are interested in using big data to "improve student performance and raise teacher/professor effectiveness, while reducing administrative workload" and to compare one institution to another, but there is no mention of the business side of the house or of studying current LMS usage to compare against a possible replacement. van Rijmenam's infographic shows the benefits for learning, but still no mention of using big data for software changes. Fleisher explains that some institutions are not using it out of concern that acknowledging they record all learning activities, and releasing the results, may harm students if the data got into the wrong hands. Guthrie points out that big data in education needs to go "beyond online learning"; administrators need to "understand that big data can be used in admissions, budgeting and student services to ensure transparency, better distribution of resources and identification of at-risk students" (2013). Perhaps one could classify technology application purchases as a student service, but I do not think that is what Guthrie is referring to.

Coursera was the one place I found that mentions the use of big data in education for more than learning; their course description includes the statement "to drive intervention and improvement in educational software and systems". So why aren't leaders doing software comparisons, including LMS reviews, required to learn big data techniques? I think it is because top academic administrators are afraid they would find out that some of their decisions, based solely on "pilot survey results", were made on inaccurate data.

For example, let's assume an institution was trying to decide between two LMSs: "The pilot consisted of 11 courses and 162 students. With 39 students, 5 faculty and 1 TA responding to a survey, when asked whether LMS2 or LMS1 was better for teaching and learning the results were":

LMS2 better:  30/45 (67%)  (Faculty only: 5/7)
LMS1 better:   4/45  (9%)  (Faculty only: 0/7)
Same:          5/45 (11%)  (Faculty only: 1/7)
n/a - unsure:  6/45 (13%)  (TA only: 1/7)

Additional notes: there were only 11 courses using LMS2 in this single semester, out of a total of 2,094 courses. Only 162 students were included in the LMS2 test, out of the 3,991 students enrolled, and only 5 faculty and 1 TA were included out of the 780+ faculty on payroll.

At first glance, the 67% sticks out, and some may say it is a strong indicator that the institution should switch to LMS2, because only 33% wanted to stay with LMS1 or were unsure LMS2 was enough of an improvement to justify a change. But that 67% is a percentage of those who responded to the survey, not of those who want to switch. The table says "7" faculty, yet the text states that only 5 faculty and 1 TA responded, and the last I checked, 5 + 1 is 6, not 7. If you compare the total number of pilot participants to the number of surveys completed, the 67% is really based on only approximately 27% of those who participated in the pilot. The student population is represented by only about 4% (162 of 3,991) and the faculty population by less than 1% (6 of 780+). And what about staff or business units that use LMS1? They were not represented in these results at all. Other questions that come to mind, and that decision makers should be asking, are: (1) Did the faculty whose courses were included actively use LMS1 to the fullest? (2) Were the included faculty tech savvy? (3) Did the included faculty have a personal issue with LMS1? (4) Which actual courses were included: were they freshman courses or senior-level courses? (5) What is more important, ease of use for faculty or better learning engagement options for students? (6) Had participants been properly shown how to use LMS1, as they were LMS2? (7) Which features of LMS2 were used, compared to the features of LMS1 that were used?
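A quick recomputation of the hypothetical pilot's numbers makes the shrinkage concrete; this sketch simply redoes the arithmetic from the example above:

```python
# Figures from the hypothetical LMS pilot above.
respondents = 39 + 5 + 1      # 45 completed surveys
participants = 162 + 5 + 1    # 168 people actually in the pilot
prefer_lms2 = 30

pct_of_respondents = prefer_lms2 / respondents   # the headline ~67%
response_rate = respondents / participants       # only ~27% even answered
student_coverage = 162 / 3991                    # share of all students piloted
faculty_coverage = (5 + 1) / 780                 # share of all faculty piloted

print(f"Preferred LMS2: {pct_of_respondents:.0%} of respondents")
print(f"Response rate:  {response_rate:.0%} of pilot participants")
print(f"Coverage:       {student_coverage:.1%} of students, "
      f"{faculty_coverage:.1%} of faculty")
```

The headline number is correct as far as it goes; the problem is everything it leaves out, which is exactly where usage analytics could fill the gap.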

I think this basic example shows that survey results alone allow for skewed reporting; add big data analytics to opinion surveys, and education decision makers would have a more realistic picture and make better decisions for the most important stakeholder, the student. Garber provides other examples of how people spin survey results to get their way. In his examples, he talks about how some people cherry-pick a statistic describing just a small percentage of a population to make things look better than they are, and that decision makers need to ask "What did the rest think?" (Garber). A 2012 paper talks about the need to develop an approach to detect interviewer falsification of survey data, noting that the detection approach was not limited to interviewers and could be applied to basic survey analysis. Robert Oak points out that falsification of figures is more commonplace in his article about the New York Post claim of falsified unemployment figures. Johnson, Parker, & Clements stated in their research, "Likewise, satisfaction that little or no data falsification has been detected previously should not serve as an excuse for failure to continually apply careful quality control standards to all survey operations" (2001). Fanelli's 2009 research showed that scientists "admitted to have fabricated, falsified or modified data or results at least once –a serious form of misconduct by any standard– and up to 33.7% admitted other questionable research practices. In surveys asking about the behavior of colleagues, admission rates were 14.12% (N = 12, 95% CI: 9.91–19.72) for falsification, and up to 72% for other questionable research practices", which would make one think there is a prevalence of researcher misconduct. Or did Fanelli mislead us with these results?

Schmarzo states, "In a world where education holds the greatest potential to drive quality-of-life improvements, there are countless opportunities for educational institutions to collaborate and raise the fortunes of students, teachers, and society as a whole" (2014), by using big data along with old-fashioned surveys. The benefits of big data can be felt by all organizations.


Do Projects Fail or Does Project Leadership Fail?

       I am sure almost everyone working has been part of a "project team" at some point even if only to plan a going away event for a co-worker. Working in a team can be challenging and yet it can be an amazing adventure.

      When I ask the question "do projects fail or do project leaders fail," I am thinking of workplace projects that have an impact on a company, a community, or the world. According to one article, HR IT projects have a pretty high rate of failure. Michael Krigsman says, "Depending on the statistics you read, 30 percent to 70 percent of these projects will be late, over budget or don't deliver the planned scope." No honest project manager will claim that a project they were running has never fallen short.

      Sometimes projects fall short because of unforeseeable events, but more often they fall short because of things that could have been fixed with the assistance of upper management, through education or common sense. Below is a list of five common issues that I have seen cause projects to fail over the last couple of years.

1: Big Egos
More often than not in today's business world, people have massive egos. Some people even confuse confidence level with ego level. Confidence is something everyone should have in the business world, but when a person's confidence convinces them that they are the best and have all the answers, so they do not have to "listen" to or "include" input from the rest of the team, the chance of failure increases.

I have seen too often that when someone with a big ego is forced to let others help, the work environment struggles due to lack of trust and anger. The ego-driven employee often struggles to show even a little respect for their project mates. Management has to ensure that every team member's ego is in check and have a way for team members to address issues in a supportive environment, even if it is the project lead's ego that is the problem. Team members may stop contributing and time will be wasted, resulting not only in failed projects but in loss of revenue for the business.

2: Hidden Agendas
Project team members need to know the real goals and outcomes of the project. This means that upper management, project managers, and team members must not have professional or personal hidden agendas. For example, a project member or upper management may see the project as their professional ticket to promotion or their name in lights. I have seen team members use a project as an opportunity to advance personal friendships or try to oust a co-worker they feel threatened by. I have seen upper management use projects to get publicity in hopes of getting a promotion. This type of behavior will only lead to an unproductive team and an unhealthy work environment.

I have seen projects where the leads have a vision of how things should be done, and while they put on a front to their superiors that they are open to suggestions and feedback, they prevent anything from moving forward unless it matches their vision exactly. This automatically causes resentment among the team and will cause project failure.

3: Overestimating Work Involved
It would be great if all managers and project leads could just say "do this and have it done by X" and every employee would comply, but that is not realistic. A report in the Houston Chronicle by Kate McFarlin stated that leaders who set unobtainable goals lead projects to failure. McFarlin explained further that it is nice to have great expectations, but those expectations need to be realistic if a team (or project) is actually going to succeed. McFarlin's report recommends that leaders break tasks into small ones along the path to the ultimate goal. This will help ensure the project's progress is easily tracked and moves forward while giving the team small feelings of success.

The other issue here is that when the project was defined, the person or persons who initially determined the amount of time it would take to complete the project had no real idea how to create a project timeline, or did not have the area of expertise to determine the true amount of time involved in each part of the project. Other things often not taken into consideration are sick time, weather, technology delays, team member availability, and the communication ability of the project lead. Good project timelines consider the "what ifs."
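One common way to build "what if" slack into a timeline is the classic three-point (PERT) estimate, which weights the most likely duration but still accounts for best- and worst-case outcomes. This is just an illustrative sketch, not a method the article prescribes, and the task names and durations below are invented:

```python
# Three-point (PERT) duration estimate: a hedged sketch of building
# "what if" slack into a timeline. All tasks/numbers are hypothetical.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Expected duration, weighting the most likely case 4x."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Per-task (best case, most likely, worst case) durations in days.
tasks = {
    "requirements": (2, 4, 8),
    "build":        (5, 10, 20),
    "testing":      (3, 5, 12),
}

expected = {name: pert_estimate(*t) for name, t in tasks.items()}
total = sum(expected.values())
print(f"Expected project duration: {total:.1f} days")
```

Note that the expected total (21 days here) already exceeds the sum of the most-likely guesses (19 days), which is exactly the padding that sick time, weather, and technology delays tend to consume.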

4: Insufficient Resources
The lack of a needed resource or resources will cause a project to fail or run late. This includes things like lack of staff, direction, money, support, equipment, office space, etc. Without all the needed resources up front, or secured early, the project is doomed from the start.

It is the project lead's responsibility, along with management's, to have a contingency plan for resources. If sufficient resources are not available, then backup strategies must be developed to do more with less or to find creative, unplanned ways to secure the additional resources. This sometimes means that the project lead must be willing to ask other team members to solve a problem and give up control temporarily, or to stop directing, get their hands dirty, and do the work themselves.

This often occurs when management or team leaders do not include all parties whom the project will ultimately affect, because in the planning stage they did not exercise the forethought they should have. Instead of admitting the mistake and moving on, they try to hide their shortcoming by blaming others and making excuses.

5:  Lack of Project Management Ability
You do not have to have a PMI certification to lead a project successfully, though it helps. Nor do you need a PhD, but you do need logic, communication skills, and some basic project skills. Munns & Bjeirmi (1996) defined project management as a process used as a control to achieve the project objectives by utilizing the organizational structure and resources to manage a project with the application of tools and techniques, without disrupting the routine operation of the company.

Usually projects are divided into phases; the most common are initiation, implementation, and closure. Each phase has checkpoints with assigned dates. The entire project should be tracked on a timeline that can be shared at a high and low level with all involved. A successful project lead will know how to properly create a project timeline, set milestones, and monitor progress often to address any trouble spots before they cause a major issue. I have seen projects fail or become sloppy due to the lack of a project timeline from the start. A new project manager should at minimum have completed classes on project management or have been mentored by an experienced PM before taking on a large project.
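As an illustration of that tracking discipline, here is a minimal sketch of flagging checkpoints that have slipped past their assigned dates before they become major issues. The phase names, checkpoint names, and dates are hypothetical examples, not taken from any specific project or tool:

```python
# Minimal milestone tracker: flag overdue checkpoints early.
# All phases, checkpoints, and dates below are invented examples.
from datetime import date

# (phase, checkpoint, assigned date, completed?)
milestones = [
    ("initiation",     "kickoff approved",   date(2016, 4, 1),  True),
    ("implementation", "build complete",     date(2016, 6, 15), False),
    ("closure",        "final report filed", date(2016, 7, 30), False),
]

def trouble_spots(milestones, today):
    """Return milestones past their assigned date but not yet done."""
    return [(phase, name) for phase, name, due, done in milestones
            if due < today and not done]

late = trouble_spots(milestones, today=date(2016, 7, 1))
print(late)  # the "build complete" checkpoint is overdue
```

Reviewing a list like this at every checkpoint date is what lets a lead address a trouble spot while it is still a slipped milestone rather than a failed project.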

Communication ability is critical for a project manager. If a person talks down to people (on purpose or not), cannot explain themselves, or is just a nasty person, they should not be placed in a project lead position. Let's face it, some people are not good at talking to other people. Some people may be better suited to working alone or doing tasks they can simply report on; these people should not be put into a project lead position either. People who will use personal and professional friendships, but have never actually managed people or a team before, should not be given major projects to lead. While some will say there are natural-born leaders, most of us have to learn it. Lack of communication causes delays or even failure: team members do not have the information they need, issues or changes do not get escalated, and project reporting becomes sluggish.

Project managers who talk about other team members behind their backs create an unhealthy working environment. Project managers who point out a team member's shortcomings to upper management without talking with the team member first are bad leaders. Project managers who do not realize that their team's success will ultimately be their own success should not be leading projects.

Just because someone says they have past connections, or has been with the company for years, or has a good idea does not mean they should be a project lead. Being a good project lead, which is basically being a manager, requires certain people skills, communication skills, and common sense that not everyone has.

         It is imperative for a project manager to understand all the stakeholders; this means listening to them, not just meeting with them. It is equally important for upper management to admit when someone is not a good project manager, even if they have amazing vision or ideas. Human Resources needs to ensure that all project managers are trained in ethical and legal management issues, so they do not end up putting the company in a legal situation.

        In order to avoid big potholes, it is important to monitor milestones and address issues quickly as they happen. It is vital for the team lead to remember that the success criteria are how the project will be judged by the rest of the team, users, management, the community, and the world. A good project lead realizes that those criteria, or the initial project plan, will probably change over time as the project moves through its stages, and they need to adapt to these changes instead of trying to ignore them. Good project leads realize that there is more to a project than just time, cost, and quality. Benefit to the organization and user satisfaction are also key factors.

Wednesday, July 29, 2015

BbWorld Promised Versus Delivered

I had the opportunity to talk with some employees of Blackboard Incorporated (Bb) during the #BbWorld15 conference last week, and after reading blogs/tweets about the company being behind on product delivery and repeating statements from the 2014 conference, I decided to re-evaluate what was promised in 2014 versus what was delivered in 2015. Let me first begin by apologizing for the length of this post.

Just do a search on #BbWorld14 and #BbWorld15 to find lots of information. For 2014, you might read a post by Kristine Putnam here and here or a post from Jason Rhode here, here, and here. If you don't trust either of these Bb MVPs, then how about reading Michael Feldstein's (e-Literate) post found here, or maybe the company's own BbWorld 2014 recap site located here. For #BbWorld15, check out the conference recap site, or Michael Feldstein's post here, or Sue Watling's blog here. You can also look at the Twitter hashtags #BbWorld14 and #BbWorld15 for more information. Regardless of where you read about the two conferences, it becomes clear, in my opinion, that the following was promised and delivered.

BbWorld 2014 primary items promised:
  1. CEO Jay Bhatt's promise the company was changing direction by embracing a Learner Centered model
  2. Removal of Java from their Collaborate product
  3. Redesign the user interface (UX) of their flagship LEARN product
  4. Investment in mobile improvements
  5. Increased K12 innovation 
  6. Improved product quality in releases and updates
BbWorld 2015 primary items delivered:
  1. Company was so student focused it failed to communicate with the rest of the client community
  2. Collaborate Ultra has no Java
  3. Learn UX redesigned (aka Learn Ultra) w/responsive design
  4. Company changed direction to serve not only self- and managed-hosting clients but also a SaaS solution 
  5. Mobile Learn and Bb Grader are out; Bb Student, Bb Teacher, and Bb Parent are in
  6. SchoolWires and ParentLink for K12
  7. Collaborate building block released for Learn during conference, Collaborate Tech Preview offered to attendees along with Learn Ultra Tech Preview during conference, Community Site Beta launched, Redesign of Help.Blackboard.Site shared, Learn Ultra released with limited initial features, and hiring of Peter George for quality control.
From this summary comparison it would seem that Bb Inc. delivered what they promised. But why were clients unhappy during BbWorld last week? My thought is that the unhappiness is due to missed communication, incorrect assumptions, and poor message marketing, which led to tweets like the following:

First of all, clients missed 2014 announcements about products like "MyJobGenie," announced during Jon Kolko's presentation, because we focused on Bb Grader and the "early demo of revised Mobile Learn" (click here to watch the video). To be honest, I missed the MyJobGenie announcement about this mobile app that lets students learn about careers; perhaps I was too focused on the Bb Grader and Mobile Learn news.

Let's now look at some other examples that led to this client confusion. Below is an image of the Learn redesign shared at the 2014 conference, captured by Kristine Putnam:
and this one from Jon Kolko's 2014 presentation of Mobile Learn:

Now look at the image below of Learn Ultra that is on Jason Rhode's Blog for BbWorld 2015:
As you can see, the images shown at both conferences are very similar, so yes, clients may quickly say it was the same thing as last year, but it was not. The UX shown in 2014 was a sneak peek, not a working model; when shown in 2015, it was live and named Learn Ultra for SaaS. Kolko's 2014 display of Mobile Learn was in very early stages of development, as stated at least three times during the presentation, and it ultimately came out as Bb Student. The same holds true for the 2014 "New Collaborate without Java," which morphed into Collaborate Ultra. Yes, the 2014 images are similar to the 2015 images; I personally think Bb marketing should not have taken the easy road and simply reused them. While it was stated in 2014, clients missed a key point: what was shown was only a goal, with very early development designs. Another issue was assumptions on the client side, because in the past whatever was shown at BbWorld became available for use shortly after. Bb management changed approaches without making that change clear to clients. The dropped communications with the client community about development status (the Learn UX changes being only for SaaS, the Mobile Learn demo being replaced by three improved mobile apps, and the Collaborate demo being just an internal development site, not a general release) are key to understanding clients' lack of excitement and trust in Bb management right now.

Notably, during this time of product revamping, eight players left the company (Maurice Heiblum, Mark Belles, Gary Lang, Michael Bisignano, Brad Koch, David Ashman, Mark Drechsler, and John Porter). While these people all landed on their feet, their departures slowed development and client community advocacy. Additionally, as Michael Feldstein pointed out, what Bb is doing with Ultra is technically difficult, the kind of thing "big companies like Google, Microsoft, and Apple are still in the process of working out right now" (July 25, 2015). This added layer of complexity, along with Peter George's thumb on quality assurance, has added unplanned time to development.
Just as clients started getting improved product communications, seen in detailed roadmap slides (within this post, thanks to Valerie Schreiner) and through Bb's new Jive client site, it was "leaked" that Providence Equity Partners LLC wants to jump ship on these innovations and the Bb client community. This move may be because Providence was expecting Bb Inc. to regain lost market share faster, instead of holding around 44% (Koner, EdSurge, Pappas 2015), or because the company is just not the cash cow they thought, or because Providence is scared of Canvas's small 3% market share increase (Koner, EdSurge, Pappas 2015), or it is just a marketing campaign. I am not willing to jump to the conclusion that Providence has lost faith in the innovation or leadership of Bb Inc. If that were the case, wouldn't Providence just fire Jay and his top management, and stop Ultra development?

To Bb Inc., I apologize for being one of those clients who made assumptions about what was shared during BbWorld 2014, for not seeing the difference between vision and real product, and for missing announcements about things like MyJobGenie. I bet Bb Inc. never expected that from me. BUT, Bb management team, I am holding you to the promised improved communications with the client communities, improved quality in all product releases, continued innovation and dedication to Learn 9.1, forward progress on Ultra and competency-based education, and inclusion of all key education roles to enable everyone to be learner focused.