Sunday, November 6, 2016

Review: 2016 Social Media Conference

I attended the 2016 Social Media Conference yet again this year. I think it was the best one I have attended, with each session giving me little tidbits of information about how to better manage and utilize social media as a professional and in my classes. This is not to say that the conference can't be improved. I would like to see stronger sessions on leveraging the analytics available in some of the platforms, as well as a track dedicated to academic research revolving around social media. But, from top to bottom, this year's conference was the best I have attended.

For my first session, I attended "An Introduction to Social Network Analysis" by Vivian Ta (@VivianTa22). A doctoral student from UTA, she presented last year and was back again. She described social network analysis as mapping and measuring the flow of information. It enables you to identify connectors, influencers, bridges, and isolates. The methodology allows you to track the spread of disease, sexual relationships, collaborations, business dealings, law enforcement targets, etc. The value of social network analysis is that it allows you to identify data and patterns of information flow. There are two types of social network analyses: egocentric and socio-centric. Egocentric analysis revolves around a single person and can be used to study outcomes such as economic success, depression, etc. Socio-centric analysis focuses on large groups and gives insight into concentration of power, spread of disease, group dynamics, etc. There are three measures used to analyze social network relationships (referred to as centrality measures): degree, betweenness, and closeness. Degree refers to the number of relationship connections each node has. Betweenness refers to nodes that link together multiple groups. They represent single points of failure and can be important linkages in terms of allowing information to travel between groups. Lastly, closeness refers to how quickly information can travel between nodes. As for actually gathering the data, you can utilize direct measures such as surveys, but this can be time consuming and difficult. Indirect measures can come from organizational records, citation analyses, co-authorships, memberships in organizations, etc. This is a cheaper approach but does not indicate how nodes are truly related to one another. Still another approach is to use data scraping, web harvesting, or data extraction techniques to pull data directly from users' social media accounts.
For example, you can harvest likes, follows, friends, reply-tos, retweets, comments, tags, etc. See OutWit Hub for an example of a data scraper. As for analyzing the retrieved data, there are several FREE platforms: NodeXL, Pajek, UCInet, NetDraw, Mag. e, Guess, R packages for SNA, and Gephi. You can examine the frequency of interactions, the types of interactions or flows, as well as the similarity of characteristics (status, location, education, beliefs, etc.). This was a nice presentation that gave me a research idea, so in my mind, it was excellent.
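To make the three centrality measures concrete, here is a minimal sketch using the open-source networkx library (one of the R/Python SNA options mentioned above); the people and relationships in the toy graph are entirely hypothetical:

```python
import networkx as nx

# A small toy network: two clusters joined by a single bridge (Cat-Dan).
G = nx.Graph()
G.add_edges_from([
    ("Ann", "Bob"), ("Ann", "Cat"), ("Bob", "Cat"),  # first cluster
    ("Cat", "Dan"),                                  # the bridging tie
    ("Dan", "Eve"), ("Dan", "Fay"),                  # second cluster
])

# Degree centrality: how many direct connections each node has.
degree = nx.degree_centrality(G)

# Betweenness centrality: how often a node sits on shortest paths
# between other nodes (bridges / single points of failure score high).
betweenness = nx.betweenness_centrality(G)

# Closeness centrality: how quickly a node can reach all other nodes.
closeness = nx.closeness_centrality(G)

# Dan links the second cluster to everyone else, so he scores
# highest on betweenness in this toy graph.
print(max(betweenness, key=betweenness.get))
```

The same three calls work unchanged on a graph built from harvested follows, retweets, or co-authorships instead of this toy data.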

The second session I attended was titled "How to Use Social Media in Higher Education to Tell Your Program's Story" by Dr. Becker and Dr. Putman. They started off by citing Hoover (2015), who found that interacting with current students and staff through social media is more effective than interacting with them face to face. They stressed the need to align social media efforts with strategic goals and that you need a dedicated social media person to handle efforts 10-15 hours per week. They used a work-study position for this, which is funded mostly through the federal government, thus reducing the funding requirement on the part of the department. Start by developing strategies and goals and develop a consistent message across platforms. It would not be wise to have a very conservative message shared through one outlet and then a very wild approach on another. Develop a hashtag for your department. They noted that it is challenging to consistently post, create messages with audience appeal, and dedicate an employee's time. You should post photos with your messages, as doing so leads to significantly more likes, comments, and click-throughs. Short posts and posts that pose a question also tend to be more effective. Finally, the best times of day to post are around 9:00 AM and 4:45 PM, so post 1-2 times per day. This was a good presentation, which gave me some good ideas for managing our own department's use of social media.

The third session was titled "Twitter for Educators - Network, Learn, Grow" by Yvonne Mulhern. She started off with an interesting quote from someone I did not catch: "Twitter makes me like people I've never met. Facebook makes me hate people I know." She talked about using Twitter for professional development, self-promotion, and the use of Twitter Lists. Follow people you admire, leaders in your field, etc. for professional development. Tweet publications, research ideas, etc. Finally, follow others' Twitter Lists, which enables you to more efficiently follow leaders in your field. Follow HigherEdJobs or Chronicles Jobs. The idea is to develop your own personal learning network (PLN). There are also several tools that enable you to streamline your use of Twitter: Twistori.com and TweetReports.com, and do not forget to look at Twitter analytics.

The fourth session was presented by Dr. Goen and Dr. Stafford and was titled "Remaining Legal and Combating Trolls on Periscope, Facebook Live, and Meerkat". They limited the discussion of Meerkat given its recent demise. They first defined a troll as an individual who visits a page to post insulting, off-topic comments to provoke some sort of emotional response. In general, they identified two different types of trolls: spammer trolls and disruptive trolls. Spammer trolls are all about themselves, selling their stuff, etc. Disruptive trolls spew insults, often sexual or violent posts, etc. Regardless of the type of troll, there are ways to combat them. On Periscope, you can set a post to follow-only, which means only those who follow you may post a comment. All others can only watch. You can also block users on a live broadcast. This does not remove a negative comment, but it does prevent them from making additional ones. Finally, you can set a broadcast to private, which only enables those following you to make comments. There are similar tools on Facebook (FB). On FB, you can report/block a person from your page. The report option provides FB with information about your objection and allows you to continue to view the other person's account. If you simply block a user, you do not have to report anything, but by doing so, you can no longer see the other person's account. You can also adjust your general privacy settings to limit what others can see. Alternatively, you can customize each post to limit its exposure. Finally, you should only accept friend requests from people you know.

The final session, by Dr. Mitzi Lewis, was over YouTube. Titled "How to Make the Most of Your YouTube Channel", Dr. Lewis boiled it down to three essential areas: Brand Your Channel, Be Found, and Keep 'em Watching. For Brand Your Channel, know your mission and post videos that are directly relevant to it. If you need to deviate, consider adding another channel. Make sure your banner, name, and icon are consistent in terms of wording, colors, style, etc. You are trying to give off a consistent image. Have a good, relevant channel trailer that is, ideally, 30-60 seconds. Finally, include channel sections to group similar/related videos. For Be Found, make sure your thumbnails are images that are relevant to the video content. She gave the example of Jimmy Kimmel videos, in which the thumbnail is usually an image of the guest being interviewed. Make sure the titles use words that people will search for. Put the meat of your descriptions early in the narrative, so they can be seen without hitting "more." If you want to identify good keywords to include, google.com/trends/explore can be helpful. Finally, keep tags short and meaningful. For Keep 'em Watching, Dr. Lewis suggested maximizing watch time by hooking viewers first and then letting them watch. So, give them a tease, an intro, and then discuss your topic. It is extremely important to hook them early. If you use playlists to organize related videos, limit the number of videos in the playlist to the teens or so. Use a watermark, which will appear on all of your videos. Finally, regularly upload videos so subscribers have something to visit. For other helpful tips and tricks, she recommended creatoracademy.youtube.com.

This year's conference was better than years past. Again, I would like to see an academic research track included as well as some strong sessions on analytics. But, this is a nice little conference that is applicable to those in higher education, primary and secondary schools, as well as businesses and other organizations. So, plan on attending next year!

Thursday, November 3, 2016

Review: ISSA 2016 International Conference

As a new ISSA member as of late last year, I attended my first ISSA conference, conveniently located (for me) in Dallas, Texas. Being used to academic conferences myself, I was not quite sure what to expect. I knew it was going to be more practitioner based, and practitioner based it was...largely from a fairly managerial perspective. So, that was in my favor. So, here's a recap of the sessions I attended.

For me, the first day was the weaker of the two days, but the first session of the day was the strongest. It was titled "Architecting Your Cyber-security Organization For Big Data, Mobile, Cloud, and Digital Innovation" by Mr. David Foote. He discussed the importance of aligning business and security objectives and argued that part of making sure that happens is having CISOs report to Boards of Directors rather than to CIOs. He argued that part of what makes managing so difficult is churn within the field. This is a result, he argued, of being spread too thin and of burnout, as experienced cyber-security specialists are constantly having to re-tune because of disruptive technologies (Cloud, Big Data, IoT, etc.). According to his research, cyber-security jobs require more certifications than other IT jobs, and there are roughly only 1,000 top-level security experts compared to a need for 10,000 to 30,000. This brought him to the point that we are in deep need of "people architecture", an alignment of people, programs, practices, and technology. The benefit is an optimization of assets, improved decision making, minimization of unwanted circumstances, etc. Finally, Mr. Foote discussed the need for consistent job titles and skills across organizations and industries. The lack of such a consistent job definition makes it hard to compare, for example, a system administrator at one organization to a system administrator at another organization. Mr. Foote did an excellent job presenting, and I would highly recommend attending other presentations he puts on.

The second session I attended was not quite as good, but I did still come away with some good content. It was titled "Improving Incident Response Plan With Advanced Exercises" by Chris Evans. He stressed the need for "pre-incident" training in order to develop muscle memory. The goal is to stretch beyond just compliance. He described several ways of doing this: workshops, tabletop exercises, games, simulations, drills, and full-scale exercises, ordered from least to most complex, with the more complex yielding more tangible benefits but requiring more investment of time, resources, and expertise. The first step is to develop the objectives so that the people who need to participate can be identified. The key takeaway was that we need to evaluate > test > assess > drill.

The third session on day one was titled "Cyber Law Update". The presenter struggled on this one. She was neither a technical person nor a manager of technical people. She was, I believe, an insurance person, and she found herself being corrected several times by the audience. Nevertheless, there was some good content to come out of the presentation. One of the key points regarded the establishment of FTC authority as it relates to cyber-security breaches. She discussed the LabMD case, in which the company was held liable not for the breach itself but rather for the failure to take "reasonable" security measures. Another valuable contribution from this presentation was that the inability to show injury is what stops most lawsuits against companies from being successful. Emotional distress does not count. You must show some sort of physical injury. If you cannot show what or how much you lost, you have no case.

The fourth session of the day was titled "Posture Makes Perfect: Cyber Residual Risk Scoring". This one was interesting. The presenter was a little unclear on the specifics of his scoring model, but the general idea was a calculation that gave you a residual value representing risk. He briefly mentioned threat maps, displayed one by Kaspersky, and mentioned the Norse map. I have seen these before but never spent much time looking at them. Having said that, in looking through my notes for this blog post, I Googled them and ran across a site that lists both of these as well as several others. These are pretty slick and can be interesting and compelling when trying to discuss how pervasive security issues are. He also referenced the over-referenced (his words) Sun Tzu quote about knowing your enemy and knowing yourself. While he argued that, through the process he was advocating, you could know your enemy, he started off stating that given the complex threat environment, you could not know your enemy. This seemed more realistic to me. There are nation states, organized crime, hacktivists, cyber criminals, etc. This makes it seemingly impossible to know your enemy with certainty, at least without a delay to properly investigate. There are just too many possibilities. But, it does suggest that we need to develop methods to more quickly identify these sources so that we can more adequately combat threats. He finished up talking about there being lots of standards and lots of certifications that demonstrate or express proficiency as it relates to assessing, developing, and implementing security in organizations. Despite all of this, breaches continue to occur. Touché!
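Since the presenter's actual scoring model was not specified, here is a hypothetical sketch of what a residual risk calculation could look like, using the common framing of inherent risk (likelihood times impact) reduced by control effectiveness; all parameter names and values are my own illustration, not the presenter's:

```python
def residual_risk(likelihood: float, impact: float,
                  control_effectiveness: float) -> float:
    """Toy residual-risk score on a 0-1 scale.

    likelihood, impact, control_effectiveness are each 0.0-1.0;
    fully effective controls (1.0) drive residual risk to zero.
    """
    inherent = likelihood * impact
    return inherent * (1.0 - control_effectiveness)

# Example: a likely (0.8), high-impact (0.9) threat with
# moderately effective controls (0.5) leaves a residual of 0.36.
score = residual_risk(0.8, 0.9, 0.5)
print(round(score, 2))
```

The value of even a toy model like this is that it forces you to state which inputs you are estimating and how controls change the number, which is exactly where the presenter was vague.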

There was a fifth session for the day, but I had to leave. Day two was really pretty solid. All the sessions were quite good, I would say. For my first session on day two, I attended "Advances in Security Risk Assessments". Presented by Mr. Doug Landoll, he started with an Einstein quote: "We cannot solve our problems with the same thinking we used when we created them." He talked about how the threat calculation, whichever one you use, needs some sort of data. You can get that data from many different places. This may be as simple as a survey: "Do you have a firewall in place?" He stated that CISOs are in high demand and that if you examine job requirements on job posting sites, the requirements can all be boiled down to "reducing risk." In order to determine risk, the process for determining a risk score is important. You have to examine the controls that are in place. For example, what is the hiring process like? You need to establish physical and logical boundaries to your assessment. You also need to apply a legitimate framework. In his opinion, some "frameworks" are not frameworks but are really just collections of a few best practices (i.e., SOX, HIPAA, PCI, etc.). Legitimate frameworks include COBIT, NIST, ISO 27001, the Cyber-Security Framework, FISMA, etc. With a framework identified, you need to have it mapped (hopefully it is already mapped by a good source) to a standard such as PCI. His point here was that standards and regulations are not frameworks. He then pointed to an article he published on LinkedIn. For assessment, he mentioned RIIOT: review documents, inspect, interview, and observe. Combine multiple approaches. To do good assessment, you need objectivity, expertise, and quality data. Finally, he plugged another person's book on quantitative assessment (Doug Hubbard). Follow this presenter on LinkedIn.

The second presentation on day two was titled "Culture Changes, Communicating Cyber Risk in Business Terms." One of the panelists stated that technology moves in something like dog years, referring to the speed of change. The concept of nation states launching cyber attacks is recent, and attack surfaces have mushroomed. It was also pointed out that the boundary of the enterprise is becoming harder to define as we rely more and more on BYOD devices, cloud services, etc. When asked about some of the recent drivers of culture change, the data breach at the Office of Personnel Management was brought up, as were the Mirai DDoS attack and Dewall. The interesting thing about this last one was that they were held responsible, not for a data breach, but rather for claiming through advertising that their systems were more secure than they actually were. Another example of driving a culture of change was ransomware and the interaction between victims and hackers. The panel concluded by discussing some of the existing standards (NIST, ISO 27005, etc.) and their focus on IT security risk, arguing that we need to refocus on enterprise risk instead. I read into this an alignment of security and business objectives.

The third session I attended for the day was titled "Stepwise Security - A Planned Path to Reducing Risk" by Wade Tongen. He described the "de-perimeterization" of organizations and how that makes securing them difficult. Per the 2016 Verizon Data Breach Report, 63% of breaches occur as a result of weak, default, or stolen passwords. He mentioned the need for identity assurance because users have multiple identities (i.e., personal, professional, privileged, non-privileged). There is a need for consolidated identities. Fragmented identities result in sticky notes, use of the same password for multiple systems, spreadsheets, etc. Use multifactor authentication EVERYWHERE. Organizations need role-based provisioning so that applications, services, licenses, etc. are all associated with a role, so that when the role changes, access changes accordingly. Finally, the speed with which we can identify perpetrators maximizes the chance of being able to do something about it. He used a convenience store robbery as an example. If it is robbed and you can give the police an accurate description quickly, they are more likely to be able to do something about it than if you cannot provide them with evidence (such as video surveillance) for several days.
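The role-based provisioning idea above can be sketched in a few lines; the role names, entitlements, and user below are purely hypothetical, but they show the key property the presenter described: access attaches to the role, so changing a user's role changes all of their access in one step.

```python
# Hypothetical roles and the entitlements attached to each one.
ROLE_ENTITLEMENTS = {
    "developer": {"git", "ci", "staging-db"},
    "dba":       {"staging-db", "prod-db", "backup-console"},
    "auditor":   {"log-archive", "report-portal"},
}

# Each user holds a role, not a hand-maintained list of entitlements.
user_roles = {"alice": "developer"}

def entitlements(user: str) -> set:
    """Resolve a user's access from their current role."""
    return ROLE_ENTITLEMENTS.get(user_roles.get(user), set())

# Alice moves into a DBA role; her access follows automatically,
# with no per-application grant/revoke bookkeeping.
user_roles["alice"] = "dba"
assert "prod-db" in entitlements("alice")
assert "git" not in entitlements("alice")
```

Contrast this with fragmented identities, where the same role change would require hunting down individual grants across every system (and likely missing some).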

The final session I attended on day two was over Mr. Robot and whether or not it was an accurate depiction of a hacker's perspective. A panel session, the consensus was that it was. I left this session as I did not really see much value in the discussion. Overall, it was a good experience. It was new to me. As I mentioned, I am used to academic conferences. But, this was a nice conference to attend; one from which I can do some further research about some of these concepts and take back and use in my classes.

#BCIS5304 #BCIS3347 #ISSEConf